Seeing fake porn of myself ‘shocking’, says AOC as she launches AI bill

Raoul Simons
Tue, 9 April 2024 

The new bill backed by Ms Ocasio-Cortez will make it easier to prosecute deepfakes - Michael Reynolds/Shutterstock

Alexandria Ocasio-Cortez has demanded a crackdown on AI after viewing a deepfake pornographic image of her likeness performing a sex act. The New York congresswoman, 34, was “shocked” when she spotted the digitally altered image on X.

Ms Ocasio-Cortez has repeatedly been targeted with manipulated images and fake social media posts using her likeness since she was elected in 2018 as the youngest woman to serve in Congress.

However, new AI tools have made such content far easier to create and the resulting images and videos more realistic.

As a result, Ms Ocasio-Cortez has been involved in crafting a new law intended to bring an end to non-consensual, sexually explicit deepfakes. It would enable victims to take legal action against the producers and distributors of the content.
‘Get this off my screen’

Speaking in February about her own experience with the pornographic deepfake, she told Rolling Stone magazine that her first thought was: “I need to get this off my screen.”

“There’s a shock to seeing images of yourself that someone could think are real,” she said. “As a survivor of physical sexual assault, it adds a level of dysregulation. It resurfaces trauma.

“It’s not as imaginary as people want to make it seem. It has real, real effects not just on the people that are victimised by it, but on the people who see it and consume it. Once you’ve seen it, you’ve seen it.

“It parallels the same exact intention of physical rape and sexual assault, which is about power, domination, and humiliation. Deepfakes are absolutely a way of digitising violent humiliation against other people.”

AI images of Taylor Swift

Taylor Swift suffered a similar experience in January, when sexually explicit AI-generated images of the singer surfaced on social media.

Fabricated pornography accounted for 98 per cent of all deepfake videos posted online, according to a 2023 study by Home Security Heroes, a cyber security firm.

The proposed US law change, which has bipartisan support, would amend the Violence Against Women Act so that victims can sue those who produce, distribute or receive such deepfake pornography, if they “knew or recklessly disregarded” that the victim did not consent to the images.

If the bill passes the House and Senate, it would become the first US federal law to protect victims of deepfakes.

“A lot of my work has to do with chain breaking, the cycle breaking, and this, to me, is a really, really, really important cycle to break,” added Ms Ocasio-Cortez.
