Saturday, 26 April 2025

The development of AI

South Korean students are being exposed to deepfake porn

Hundreds of students, pupils and school employees in South Korea have fallen victim to so-called "deepfakes" with pornographic content, CNN reports. Between January and November 2024, more than 900 people in schools, mostly women and girls, reported being targeted.

The perpetrators take photos from the victims' social media accounts and edit them onto naked bodies in images and videos, which are then distributed on messaging services such as Telegram. Many of them are created with the help of AI. Victims who spoke to CNN say their lives have been ruined.

Legislation to address the problem has been tightened, but of the 946 cases reported last year, a figure that does not include universities, only 23 led to arrests.
 
Student was shocked: “Crushed my worldview”

The 27-year-old university student Ruma is one of hundreds of people in South Korea who have fallen victim to so-called deepfake porn. She tells CNN that one day she received messages containing images in which her face had been edited onto naked bodies, images that were then spread further.

The comments under the post on the Telegram app were derogatory and users threatened to spread the content further. She felt humiliated and violated in an “almost irrevocable” way.

"It shattered my whole worldview," says Ruma, which is not her real name.

Ruma initially received little help from the police, but an activist managed to infiltrate the chat, and the perpetrators were later sentenced to prison.
 