South Korea Is Facing a Deepfake Porn Crisis
South Korea has been rocked by a wave of sexually explicit deepfake images being created and shared online, indiscriminately targeting women and girls using their school photos, social media selfies and even military headshots. A Telegram channel with more than 220,000 participants was reportedly being used to create and share these artificial intelligence-generated pornographic images, serving as a global reminder of the darker consequences of the widespread rush to embrace this technology.
Users could upload photos and, in a matter of seconds, create explicit content with the faces of their friends, classmates or romantic partners. The initial reporting from local news outlets prompted more women to come forward as victims and led to more Telegram channels being exposed, revealing the true scale of the issue. Many of the victims are minors. Protesters wearing white masks over their eyes gathered in Seoul late last month to call for justice.
As outrage mounted, South Korean President Yoon Suk Yeol called on his government to crack down on the digital abuse, and authorities said they would form a task force to tackle the issue. While police investigated, another disturbing trend emerged: preliminary data indicate that the vast majority of suspected perpetrators in the recent wave of cases were in their teens.
"Some may dismiss it as a mere prank, but it is clearly a criminal act that exploits technology under the shield of anonymity," Yoon said during a cabinet meeting late last month, acknowledging that many of both the victims and perpetrators were minors.
Yoon is right. This isn't just a prank, and the impact of this type of digital sexual violence can be devastating for victims. A lot of the blame is rightly being placed on Telegram, especially as the scandal came to a head just as the platform's Chief Executive Officer, Pavel Durov, was arrested and charged in France for alleged complicity in crimes committed on his app, including the sharing of child pornography. Korean authorities said Telegram is cooperating with the investigation and with requests to remove content.
But Yoon's words will ring hollow for some, given he came to power in 2022 courting young male voters with proposals to scrap the gender equality ministry, which he accused of treating men like "potential sex criminals." He also claimed that systemic gender discrimination does not exist in South Korea, and implied feminism was to blame for the country's low birth rate. Meanwhile, women earn some 30% less than their male counterparts, marking the highest gender wage gap in the developed world. Even in dual-income households, women bear the brunt of housework and childcare responsibilities.
And before the recent crop of AI tools made generating deepfake explicit images much easier, advocacy groups had been drawing attention to a rash of digital sex crimes, usually involving non-consensual intimate images or hidden cameras.
There is a host of reasons that South Korean women, like their counterparts across the developed world, are choosing not to have children, including rising participation in the labor force and the uneven toll of raising children. Rather than blame feminism, it might be more useful to look at this mountain of deeply troubling data. Ironically, one of the main Telegram groups spreading these images reportedly had some 227,000 members - that's roughly on par with the number of babies born last year.
South Korean women are boldly bringing this issue to light with protests and activism, the first step in enacting change for a crisis that is impacting people around the globe. Compared to other jurisdictions, the nation is also ahead of the curve when it comes to regulating deepfake porn. It actually has laws in place, including punishments of up to five years in prison and fines for people convicted of creating images with the intent to distribute. In the US, federal legislation has garnered bipartisan support among lawmakers but is still making its way at a snail's pace through Congress.
Regulation is important, but the cases in Korea also expose how difficult enforcement can be for such a widespread problem, as well as how easy it is for this content to be created and shared in the first place. Expanding laws to make it illegal to possess such material could help. But this still puts a lot of responsibility on victims to digitally trace back who created the content that upended their lives.
The Telegram CEO's recent arrest shows there is growing global momentum to hold tech companies accountable for harms taking place on their platforms. This is a step in the right direction; as regulators shift more responsibility onto these powerful companies, more industry-led solutions would be welcome. Universities and research centers are developing some promising tools to protect images from AI manipulation. But the industry building AI is devoting too few resources to ensuring its responsible rollout.
More onus must be put on tech companies to come up with proactive solutions. Many tech giants have pulled back on ethical and responsible AI teams in recent years amid broader cost-cutting. But the dignity of women and girls isn't something that should be sacrificed in favor of profit.
South Korea is the current epicenter of this crisis. But it is a global issue that can impact anyone from celebrities like Taylor Swift to middle school girls from Seoul to New Jersey. The tech sector cannot keep skirting responsibility much longer.