Female Content Creators Stand Up Against Deepfake Porn

Controversy erupted once again on the entertainment platform Twitch this week after a popular streamer known as Atrioc was discovered to have purchased deepfake porn of fellow streamers. In response, female streamers affected by non-consensual sexual content featuring their likenesses have begun to speak out against deepfake porn.

Deepfakes are media created with technology that digitally alters a person's face or body so that they appear to be someone else. In the case of deepfake porn, popular female content creators' faces are digitally inserted onto bodies that could presumably pass as their own, producing sexually explicit content without the creators' consent.

The AI technology has been outlawed in three states: Texas, Virginia, and California. No federal ban on the technology has been proposed as of yet. Though the threat of deepfakes most prominently affects public figures, private citizens have also been victims of what activists call image-based sexual abuse.

Deepfakes have gotten national attention for their power to spread political misinformation; however, research shows that an overwhelming majority of them are used to target women. Sensity AI, a research company that has tracked online deepfake videos since December 2018, has consistently found that between 90% and 95% of them are nonconsensual porn. About 90% of that is nonconsensual porn of women.

Victims of image-based sexual abuse have banded together to tackle deepfakes with a movement called #MyImageMyChoice. The campaign’s website serves mostly as an archive for survivor stories, as well as an information hub about the issue. A petition started by the campaign to shut down websites that host deepfake porn currently has over 51,000 signatures.

“Our private, intimate images have been shared without our consent. We’ve been secretly filmed, threatened, deepfaked. We’ve been assaulted, and footage from this has been distributed,” the movement’s website reads.

“Our images have been requested, traded, and purchased on forums, chat rooms, and porn sites. But we should all have a right to privacy and agency over images of our bodies. Intimate image abuse can have devastating, even life-threatening impacts. But most victims don’t receive support from governments, police, tech platforms, or the law. The only action they can take is to learn to move on.”

In addition to campaigning to shut down websites that host these non-consensual images, victims of image-based sexual abuse have promised legal action in response to the use of their likenesses.

“I promise you. With every part of my soul, I’m going to fucking sue you,” said QTCinderella, a popular Twitch streamer affected by the deepfake issue. “That’s all I have to say.”
