SEOUL, South Korea — Three years after a 30-year-old South Korean woman received a barrage of online fake images that depicted her nude, she is still being treated for trauma. She struggles to talk with men. Using a mobile phone brings back the nightmare.
“It completely trampled me, even though it wasn’t a direct physical attack on my body,” she said in a phone interview with The Associated Press. She didn’t want her name revealed because of privacy concerns.
Many other South Korean women recently have come forward to share similar stories as South Korea grapples with a deluge of non-consensual, explicit deepfake videos and images that have become much more accessible and easier to create.
It was not until last week that parliament revised a law to make watching or possessing deepfake porn content illegal.
Most suspected perpetrators in South Korea are teenage boys. Observers say the boys target female friends, relatives and acquaintances, also mostly minors, as a prank, out of curiosity or misogyny. The attacks raise serious questions about school programs but also threaten to worsen an already troubled divide between men and women.
Deepfake porn in South Korea gained attention after unconfirmed lists of schools that had victims spread online in August. Many girls and women have hastily removed photos and videos from their Instagram, Facebook and other social media accounts. Thousands of young women have staged protests demanding stronger steps against deepfake porn. Politicians, academics and activists have held forums.
“Teenage (girls) must be feeling uneasy about whether their male classmates are okay. Their mutual trust has been completely shattered,” said Shin Kyung-ah, a sociology professor at South Korea’s Hallym University.
The school lists have not been formally verified, but officials including President Yoon Suk Yeol have confirmed a surge of explicit deepfake content on social media. Police have launched a seven-month crackdown.
Recent attention to the problem has coincided with France’s arrest in August of Pavel Durov, the founder of the messaging app Telegram, over allegations that his platform was used for illicit activities including the distribution of child sexual abuse material. The South Korean government said Monday that Telegram has pledged to enforce a zero-tolerance policy on illegal deepfake content.
Police say they’ve detained 387 people over alleged deepfake crimes this year, more than 80% of them teenagers. Separately, the Education Ministry says about 800 students have informed authorities about intimate deepfake content involving them this year.
Experts say the true scale of deepfake porn in the country is far bigger.
The U.S. cybersecurity firm Security Hero called South Korea “the country most targeted by deepfake pornography” last year. In a report, it said South Korean singers and actresses constitute more than half of the people featured in deepfake pornography worldwide.
The prevalence of deepfake porn in South Korea reflects various factors: heavy smartphone use; an absence of comprehensive sex and human rights education in schools; inadequate social media regulations for minors; and a “misogynic culture” and social norms that “sexually objectify women,” according to Hong Nam-hee, a research professor at the Institute for Urban Humanities at the University of Seoul.
Victims speak of intense suffering.
In parliament, lawmaker Kim Nam Hee read a letter by an unidentified victim who she said tried to kill herself because she didn’t want to suffer any longer from the explicit deepfake videos someone had made of her. Addressing a forum, former opposition party leader Park Ji-hyun read a letter from another victim who said she fainted and was taken to an emergency room after receiving sexually abusive deepfake images and being told by her perpetrators that they were stalking her.
The 30-year-old woman interviewed by The AP said that her doctoral studies in the United States were disrupted for a year. She is receiving treatment after being diagnosed with panic disorder and post-traumatic stress disorder in 2022.
Police said they’ve detained five men for allegedly producing and spreading fake explicit content of about 20 women, including her. The victims are all graduates of Seoul National University, the country’s top school. Two of the men, including one who allegedly sent her fake nude images in 2021, attended the same university, but she said she has no meaningful memory of them.
The woman said the images she received on Telegram used photos she had posted on the local messaging app Kakao Talk, combined with nude photos of strangers. There were also videos showing men masturbating and messages describing her as a promiscuous woman or prostitute. One photo shows a screenshot of a Telegram chatroom with 42 people where her fake images were posted.
The fake images were very crudely made, but the woman felt deeply humiliated and shocked because dozens of people, some of whom she likely knew, were sexually harassing her with those photos.
Building trust with men is stressful, she said, because she worries that “normal-looking people could do such things behind my back.”
Using a smartphone sometimes revives memories of the fake images.
“These days, people spend more time on their mobile phones than talking face to face with others. So we can’t really easily escape the traumatic experience of digital crimes if those happen on our phones,” she said. “I was very sociable and really liked to meet new people, but my personality has totally changed since that incident. That made my life really difficult and I’m sad.”
Critics say authorities haven’t done enough to counter deepfake porn despite an epidemic of online sex crimes in recent years, such as spy cam videos of women in public toilets and other places. In 2020, members of a criminal ring were arrested and convicted of blackmailing dozens of women into filming sexually explicit videos for them to sell.
“The number of male juveniles consuming deepfake porn for fun has increased because authorities have overlooked the voices of women” demanding stronger punishment for digital sex crimes, the monitoring group ReSET said in comments sent to AP.
South Korea has no official records on the extent of deepfake online porn. But ReSET said a recent random search of an online chatroom found more than 4,000 sexually exploitative images, videos and other items.
Reviews of district court rulings showed that fewer than a third of the 87 people indicted by prosecutors for deepfake crimes since 2021 were sent to prison. Nearly 60% avoided jail through suspended terms, fines or not-guilty verdicts, according to lawmaker Kim’s office. Judges tended to lighten sentences when those convicted repented for their crimes or were first-time offenders.
The deepfake problem has gained urgency given South Korea’s serious rifts over gender roles, workplace discrimination, mandatory military service for men and social burdens on men and women.
Kim Chae-won, a 25-year-old office worker, said some of her male friends shunned her after she asked them what they thought about digital sex violence targeting women.
“I feel scared of living as a woman in South Korea,” said Kim Haeun, a 17-year-old high school student who recently removed all her photos on Instagram. She said she feels awkward when talking with male friends and tries to distance herself from boys she doesn’t know well.
“Most sex crimes target women. And when they happen, I think we are often helpless,” she said.