The students and stans saving K-pop idols from deepfake porn

When Momoland's Nancy was targeted, South Korean students asked their government to step in. Then these groups took matters into their own hands.

by Kelly Nguyen | 02 June 2021, 10:59am

Deepfakes -- digitally altered videos in which AI is programmed to replace the face of the subject with someone else’s -- are perhaps the most dangerous weapon of technological warfare against women in today’s virtual hellscape. Moulded with familiar misogyny and then refined through advanced technology, pornographic deepfakes pose a never-ending, humiliating threat. It was an issue that first targeted Hollywood actors but now, with K-pop a global phenomenon, idols are increasingly being turned into porn stars without their consent.

It’s not uncommon, among the sea of thirst tweets and barrages of fancams in the K-pop stan Twitter milieu, to see accounts spamming trending hashtags with deepfake porn videos. The dead eyes of distorted South Korean music industry stars stare back at viewers as their likenesses are edited into often hardcore sex videos. In fact, in a 2019 study by Sensity AI -- a research company following the internet’s rapidly growing deepfake landscape -- researchers found that 96% of deepfakes in existence online were non-consensual porn. 100% of videos on the top five deepfake pornography websites featured female subjects.

But deepfakes weren’t invented with the intention of creating a dumpster fire of politics, privacy and women’s rights. Back in the 90s, academics at the University of California, Berkeley utilised an early form of the technology to innocently match mouth movements with dubbed film audio. The term “deepfake” wasn’t coined until 2017, when the Reddit user “u/deepfakes” ensured things very quickly went to shit, gaining notoriety for superimposing the faces of celebrities including Gal Gadot onto faux-cest porn.

Fast-forward to today, and while some people are using deepfake technology to doctor amusing skits of Greta Thunberg and forward-thinking Troye Sivan music videos, there are also those willing to violate their favourite idols’ privacy for their own pleasure. Compromising photographs of 21-year-old K-pop star Nancy from the girl group Momoland, taken without her knowledge in January 2021, were manipulated using deepfake technology. It was the first time the Korean general public became aware of the thus-far underground K-pop deepfake problem, and citizens quickly began circulating a petition addressed to their presidential office. Officials responded, explaining that they were working to track down creators through messaging platforms like Telegram and Discord, and Nancy’s entertainment agency released a statement saying they’d be taking legal action. In a digitally dystopian fashion, Nancy quickly returned to fans’ screens to promote her group’s latest release, smiling and dancing along to choreographed moves as though nothing was wrong. At the time of writing, her agency has yet to update fans on the case.

“It feels like everyone is looking at your body… it becomes difficult to think about yourself being seen in that way by people you didn't choose to see you that way.” – Danielle Citron, Cyber Civil Rights Initiative

While it might have shocked the general public, the attack on Nancy came as no surprise to Jiyeon Jeon, a student researcher specialising in women and technology at Ewha Womans University in Seoul. “I think many South Korean women [in particular] are even more sensitive to these deepfake videos because we have a shared trauma against tech sex crimes,” Jeon says. She’s talking about molka -- the epidemic of men hiding cameras in public changing rooms and bathrooms in order to record women and girls and create violating voyeuristic content. It's something that has plagued the minds and bodies of women in South Korea for decades, according to Danielle Citron, a law professor at the University of Virginia’s School of Law and vice president of the Cyber Civil Rights Initiative.

With police officials making light of molka, and it looking an awful lot like nobody had their backs, young women in South Korea started their own revolution. In her work studying the country’s radical feminist movement at Seoul National University, researcher Mikayla Neyens explains how women quickly banded together to create safe spaces in the aftermath of the “Nth room” cyber sex ring -- where thousands of perpetrators blackmailed and extorted young women and girls into performing sexually-explicit acts on camera. The ring was also the source of many K-pop idol deepfakes, created and circulated within their chat rooms.

In response, young feminist communities soon started Twitter accounts like @ProjReset and @DigitalCrimeKRS, to circulate information and petitions, as well as tracking down and publicly exposing perpetrators of digital sex crimes themselves. “Without persistent petitions and Twitter trends, the [Nth room] issue would not have grown to the point it did, resulting in real consequences for the perpetrators and national media attention and condemnation,” Neyens says. But the revolution didn’t stop there. The college journalists who worked to expose crimes related to the Nth room continue to investigate cyber sexploitation on their popular YouTube channel, and female students at Ewha Womans University have engineered groundbreaking deepfake detection tools. The system they developed can detect deepfakes with close to 100% accuracy and performs at twice the speed of other detection models on the market.

Deepfakes, of course, are a form of cyber violence used to control, exploit and humiliate women; something Citron says is kindled by the patriarchal South Korean culture. More broadly, it’s indicative of global misogyny writ large. “You're co-opting someone's identity without their permission, so you're stealing their sexual agency,” she says. “You're coercing sexual expression; you're affixing them with a damaged identity.” Having worked with survivors of such abuse for years, including those in the public eye, she notes the difficulty they have in separating themselves from the resulting images. “It feels like everyone is looking at your body,” she says. “It's just your face -- not your breasts, your genitals. It's really hard [for many people] to disaggregate that, and it becomes difficult to think about yourself being seen in that way by people you didn't choose to see you that way.”

“The underlying assumption is that because these videos are fake, nobody is getting hurt and thus it’s not illegal, which is why, I think, these videos are so omnipresent.” – Jiyeon Jeon, Ewha Womans University

On deepfake creation forums Citron has researched, there’s a racialised aspect to the digital manipulation, pushing dehumanising, demeaning and fetishising tropes of Asian women onto popular K-pop idols. Naturally, this impacts the mental wellbeing not just of the victims of such attacks but of other women too, by perpetuating the idea that agency as a woman online -- especially a woman of colour -- can easily become nothing more than a facade.

Frustratingly, K-pop’s global popularity in recent years has bred a large-scale market for sellers to continue feeding fantasies of female idols, forgoing acknowledging them as actual people, all for a profit. “It’s one thing wanting to watch porn, but a totally different thing to want to watch your favourite idol in the porn without their consent,” Jeon adds. Disturbingly, she has even seen videos using the faces of underage and barely legal K-pop stars traded online. “The underlying assumption is that because these videos are fake, nobody is getting hurt and thus it’s not illegal, which is why, I think, these videos are so omnipresent.”

So, what legal action can be taken against the creators and distributors of these videos, when they’re currently allowed to simply exist in their infinitely re-playable online state?

Through her work, Citron is determined to change Section 230 of the U.S. Communications Decency Act, which provides immunity for website platforms. In its current form, they’re not held responsible for third-party content -- whatever their users do is simply not their problem. While South Korea does have laws criminalising deepfake creation, its government has a difficult time enforcing them once videos are let loose online. Elsewhere, creators break laws without consequence. Currently, one of the largest websites hosting thousands of K-pop deepfakes uses a server located in the United Kingdom, making it near-impossible for South Korean lawyers to take action against creators.

Some of the most frequently deepfaked girl groups include TWICE and BLACKPINK, whose stans are taking matters into their own hands in order to seek justice. A K-pop girl group stan Twitter account told us how communities on the platform have banded together to mass report deepfake spreader accounts, especially after their emails to entertainment agencies went unanswered. “We couldn’t just be mad at the companies that did nothing,” they said, “[so we] shared the account IDs with hundreds of K-pop fans and deleted them.”

This perpetually fucked means of controlling women is a sobering reminder of reality, but it has also ushered in a renewed sense of community -- where women rely on other women for their freedom. It’s exactly what female students at South Korea’s Pohang University of Science and Technology, or POSTECH, are tackling through their Female Student Union. While accepted in the mainstream elsewhere, the term ‘feminism’ is often met with vitriol in South Korea. Like many feminist organisations throughout the nation, the Female Student Union has faced resistance over the years -- their digital sex crime educational events have been shut down after misogynistic critics spewed loud, ugly accusations of ‘reverse sexism’, and the organisation’s purpose is routinely questioned by campus administration. The group refuses to be deterred. They rely on each other to provide indispensable support and resources and stand in solidarity, so women no longer have to quietly suffer misogyny’s brutality.

They’re taking back the right to live their lives and empowering other women to do the same. “We hope to live a life that is equally respected,” Female Student Union vice-president Ji Seon Lee says. Cyborg Girl Summer be damned! South Korean women will not allow their perception and personhood to be owned by anyone but themselves; they’re rejecting the panopticon of digital damnation and instead are manufacturing their own futures -- all with a simple goal in mind. “Before being women,” says Lee, “we want to be treated as human beings.”
