
South Korea Reflects: Deepfake Sex Crimes Plunge Women into Panic


The main perpetrator of South Korea’s notorious “Nth Room” case, Cho Ju-bin, is still serving his sentence, and nearly three years have passed since the “Nth Room Prevention Law,” which strengthened penalties for digital sex crimes, took effect. Yet South Korea is once again gripped by fear, this time over sex crimes facilitated by deepfake technology. On social media, and especially in Telegram chat rooms, deepfake pornography targeting women is being widely disseminated, with victims including minors, female college students, female teachers, and female soldiers. The Hankyoreh newspaper reports that deepfake sex crimes have left South Korean women “in a state of panic.” The Korea Women’s Association recently issued a statement saying that women have lost their sense of safety in everyday life and feel as though they are living “in a state of having no country to protect them.” Recently, Korean media and experts have begun reflecting on the complex causes behind the wave of deepfake sex crimes, citing the new risks brought by advances in artificial intelligence, the lack of sex education for Korean youth, weak legal awareness, inadequate penalties, insufficient manpower in state agencies, and structural gender discrimination.

Deepfake Technology + Bot Distribution + Encrypted Transactions

In April of this year, The Korea Times reported that as digital technology plays an ever larger role in daily life, more than a quarter of sex crimes in South Korea are now committed online using digital devices. According to Statistics Korea, the number of digital sex crimes has more than doubled for two consecutive years. The Korea Herald noted that some criminals are exploiting new features of online platforms in increasingly malicious ways, making digital sex crimes harder to prevent and combat.

Deepfake technology, a product of the rapid advancement of generative artificial intelligence (AI) over the past two years, is the latest tool used in digital sex crimes. An analysis in the Asia Business Daily noted that South Korea’s “Generation Z” readily accepts generative AI and generally views it positively. A recent Samsung survey of Generation Z office workers in South Korea, the United States, the United Kingdom, France, and Germany found that 80% of Korean respondents named AI as their first choice when they need help with their work, a far higher share than in the other countries. Experts say this reflects both how far AI’s influence reaches in South Korea and the need to stay alert to its potential risks.

According to an August 31 report in Sisain magazine, deepfake technology is being widely used to create pornographic content on many overseas websites. Some sites can generate pornographic content simply from a link to the target’s social media profile, and even offer options to adjust the person’s body shape, with the entire process taking less than two minutes. Since the Korea Women’s Human Rights Institute established the Digital Sex Crime Victim Support Center in 2018, the center had assisted 2,154 victims of deepfake sex crimes as of August 25 this year, 781 of whom sought help this year alone. At this rate, the number of victims could exceed 1,000 this year. A study by a U.S. cybersecurity company shows that more than half of the victims of deepfake sexual exploitation worldwide are Korean.

Professor Yoo Seung-cheol (phonetic) of Ewha Womans University’s Media Convergence Department said that beyond content production, the way illegal content is distributed and spread in cyberspace has also changed, which is why digital sex crimes are growing exponentially while remaining “hidden.” Sisain magazine reported that Telegram’s encrypted chats allow users to evade tracking by investigative agencies, making the platform a haven for digital sex crimes. After producing deepfake pornographic content, perpetrators use simple scripts to distribute it via bots. As of August 28, Seoul police had identified eight such bots and launched investigations. These bots can serve up to 400,000 people simultaneously.

Behind the wave of deepfake sex crimes is a profit chain built on cryptocurrency. Asia Today reported that operators of Telegram chat groups initially produce and distribute deepfake porn videos for free, then begin cultivating members and monetizing. Users can create deepfake porn content for free at first but must pay in Telegram’s virtual currency for further synthesis; those without virtual currency can earn it by inviting new users. In April of this year, Telegram began sharing advertising revenue with owners of groups with more than 1,000 subscribers, payable only in cryptocurrency, which gives users near-total anonymity. One illegal deepfake content channel on Telegram currently has more than 270,000 subscribers.

The Segye Ilbo reported that after the incident was exposed, chat room operators continued their activities in an organized manner. When some chat rooms were shut down, they reopened under different names, such as “shelters.” Users joined through invitation links, and a chat room could instantly exceed 2,000 participants. In one chat room, a user even incited others, saying, “Even if the news comes out, don’t be afraid. Let’s humiliate acquaintances and the journalists writing the reports.”

The Hankyoreh newspaper reported that most victims of deepfake sex crimes suffer the pain of seeing their faces synthesized into pornographic images and videos. Yet Korean society greatly underestimates the severity of these crimes. Even after the crimes were widely exposed, many Korean netizens played down the harm, saying things like “I don’t understand why this crime causes so much harm,” or “If it’s just a few people making it themselves, the harm should be minimal, right?” Kim Soo-ya (phonetic), a professor of media and information at Seoul National University, noted, “The legal framework for digital sex crimes focuses on whether the victim feels sexual shame, but people may see this crime as less severe than illegal filming because the body in the synthesized image doesn’t really exist.” She emphasized, “But whether the image is real or not, as long as such an image exists or rumors spread, the victim will suffer.”

70% of Perpetrators Are Teenagers Aged 10 to 19

One concerning aspect of the recent wave of deepfake sex crimes is the extensive involvement of minors. South Korean President Yoon Suk-yeol stated on August 27 that many of the victims exposed in these incidents are minors, and the majority of the perpetrators are teenagers. According to a report by Yonhap News Agency on August 30, nearly 60% of victims in deepfake sex crime cases over the past three years in South Korea were minors, and among those indicted for producing fake pornographic videos, 75.8% were aged 10-19 last year, with the figure at 73.6% from January to July this year.

“This isn’t something that suddenly erupted among teenagers; it has always existed,” said Lee Myung-hwa (phonetic), director of the Seoul Youth Culture Center. “Deepfakes are spreading like a gaming culture.” According to The Korea Times, despite efforts by the Ministry of Education, school violence incidents continue to surge. Among the types of school violence, physical violence and verbal abuse are most common, followed closely by sexual violence and cyberbullying.

Currently, some parents of deepfake perpetrators are moving to protect their children. According to Newsis on August 28, after the regions and schools involved in deepfake sex crimes were exposed, the perpetrators’ names also began circulating on social media. Some parents, believed to be those of teenage male students, are reportedly hiring companies to scrub social media of information about their children and of the illegal deepfake images the children produced.

The Korea Herald reported that distorted sexual values and weak legal awareness among minors have contributed to the spread of such content. Sociology professor Heo Chang-deok (phonetic) of Yeungnam University said that discussing sex remains taboo in Korean society: people avoid the subject both at home and in the classroom, leaving the younger generation without a proper understanding of it. Many young people learn about sex through the internet and pornography, which often fosters distorted and unhealthy attitudes. Under the Ministry of Education’s guidelines, elementary, middle, and high schools in South Korea currently devote 15 hours a year to sex education, but the ministry provides no standardized guidance to local schools, and the classes are often taught by teachers of subjects such as Korean, English, or physical education. In addition, official media and schools offer little legal education on sexual crimes, so the exposure of online cases often merely piques students’ curiosity and makes them more likely to imitate or join in digital sex crimes.

Recently, South Korean experts have called for more systematic sex education in schools and for curricula to be updated to reflect current social issues. Sociology professor Shin Kyung-ah (phonetic) of Hallym University suggested integrating gender equality education into regular school curricula and teaching such courses more often, for instance once a month. She said, “Students are exposed to digital devices from an early age, so this education should start early, ensuring they receive consistent guidance from the lower grades of elementary school.”

“Ultimately, isn’t the fundamental way to prevent deepfake sex crimes education that teaches people not to become perpetrators?” wrote Ha Min-hye (phonetic), director of the Korea Future Technology and Culture Institute, in a September 2 article. She argued that the involvement of so many minors in deepfake sex crimes shows that digital citizenship and media literacy education have not taken hold in South Korea. In her view, understanding “what technology should and shouldn’t be used for” matters more than mastering the technology itself. By comparison, more and more American universities require freshmen to receive media literacy training, including how to use media wisely and how to write appropriate comments, but these skills are not being taught in South Korea.

“Most Participants Were Not Punished”

According to South Korea’s Asia Economy, after the government announced a crackdown on deepfake sex crimes, an online community sprang up to discuss how to handle the investigations. Many perpetrators and their parents voiced their worries there. One user posted, “My son is in a deepfake chat room, is this okay?” A community moderator claimed, “Something similar happened four years ago. Except for the main perpetrator and a few accomplices, most participants were not punished… The kids in the chat room can relax.”

“In the UK and the US, producing deepfake content is punishable. In South Korea, only distribution is punished,” The Chosun Ilbo reported. Under Article 14-2 of South Korea’s Act on Special Cases Concerning the Punishment of Sexual Crimes, anyone who edits or synthesizes another person’s face, body, or voice into sexually explicit material, or distributes such material, faces up to five years in prison or a fine of up to 50 million won. However, if intent to distribute cannot be sufficiently proven, merely producing and viewing the content is not punishable. By contrast, the UK Ministry of Justice announced in April this year that anyone who creates pornographic content using deepfake technology can be convicted, regardless of whether it is shared or distributed. In July this year, the U.S. Senate passed a bill allowing victims of deepfake content to seek compensation. On September 1, the California State Legislature’s passage of proposals targeting deepfakes also drew attention from Korean media.

In fact, after the “Nth Room” case came to light, the South Korean National Assembly passed an amendment to the Information and Communications Network Act, known as the “Nth Room Prevention Law,” which took effect in December 2021. Although the amendment has been in place for nearly three years, its limited effectiveness has been a subject of ongoing debate. On August 28, a co-perpetrator in the “Seoul National University illegal synthetic pornography case,” in which photos of dozens of women, including university alumni, were illegally synthesized into pornographic content, was sentenced to five years in prison at the first trial. It was the first sentencing in the case, and critics called the punishment too lenient. On August 29, the ruling People Power Party and the government decided to legislate stricter punishments for deepfake sex crimes, including considering raising the maximum sentence for distributing deepfake pornographic videos from five to seven years and lowering the age threshold for “juvenile offenders” (children aged 10 or older but under 14, who are exempt from criminal punishment). The Ministry of Education recently set up an “Emergency Task Force for Responding to School Deepfake Incidents” and announced that offending students could face expulsion.

According to Asia Today, although the deepfake sex crime situation is severe, the budget for the Digital Sex Crime Victim Support Center is expected to be cut further next year. Yonhap News reported on September 2 that although South Korea may have tens of thousands of digital sex crime perpetrators, the number of investigators in some regions has actually decreased. As of June this year, the cybersex crime investigation teams across the country’s 18 cities and provinces had only 131 members in total.

The Hankyoreh quoted South Korean women’s rights expert Kim Soo-jung (phonetic) as saying that women are targeted in deepfake pornographic content “because not only in daily life but also in the public sphere, the discriminatory perception that women are not equal beings is at play.” The Korea Women’s Associations United criticized the collective crime as an extension of gender discrimination: “The government had previously even attempted to abolish the Ministry of Gender Equality and Family, ignoring the social suffering of women.” The organization called on the government to “establish a gender equality department and formulate cross-departmental, comprehensive long- and short-term measures to address structural gender discrimination.”

