Archived: Deepfake Porn Victims Are Seeking Federal Protections Through Legislation

This is a simplified archive of the page at https://www.teenvogue.com/story/deepfake-porn-victims-are-seeking-federal-protections-through-a-new-bill

Deepfake porn victims are speaking out about the lack of protections against abusers. A new bill could change things.

Tags: identity, consent, sexual violence

When Lauren* watched a video of herself having sex with a man she had never been intimate with, she had a panic attack. Though the body of the woman in the video was thinner and whiter than hers, it was undeniably her face. She couldn’t stop crying. How could it look so realistic? And what could she do about it?

As the capabilities of artificial intelligence rapidly increase, so do the ways people are targeted by it. According to Psychology Today, “[a 2019] study reports that 96% of deepfake victims are sexualized, and nearly all of them are women.” Lauren had found herself, as many others have, the victim of deepfake porn – an increasingly popular subset of pornography in which a person’s face is superimposed onto another person’s body that is engaged in sexual activity. The creators of deepfake porn don’t have to be computer geniuses – the videos can be made with relatively simple face-swapping apps available on most phones. Newer, more advanced AI tech makes the phenomenon even more likely to proliferate.

Lauren, whose name has been changed for privacy concerns, became the subject of deepfake porn after Dan*, a man she met at her local gym, asked her out on a date. When she told him she wasn’t interested, she didn’t think much of it, but Lauren says the next time she saw Dan at the gym, the situation escalated. “He became aggressive and frustrated, saying I should just give him a chance,” Lauren said. “I should have told the gym management, but I was embarrassed and just wanted to get out of there.” When Lauren returned to the gym again, she said a man followed her toward the locker rooms and said he needed to speak to her. “He told me Dan was showing people a video of us having sex,” she said. “I didn’t believe him at first because I’d never had sex with Dan, so it didn’t seem possible.”

But thanks to the rise of deepfake and face-swap technology, it was possible. Dan had created a video that appeared to show him and Lauren having sex and was showing it to the men at the gym, Lauren said. A few days later, Dan posted the video on his Instagram story, using Lauren’s first and last name and bragging that he would turn the fake video into a reality. After Lauren reported Dan to the gym’s management, she said, his membership was canceled and he was banned from returning, but she didn’t want to return either. “I didn’t want to go knowing that a bunch of the guys had seen porn of me,” she said. “Even though it was fake, it still made me feel really ashamed and gross.” On the advice of a loved one, Lauren turned to a lawyer, who told her she didn’t have a case to sue for defamation. Lauren said she was also told she couldn’t sue under revenge porn laws because it wasn’t technically revenge porn – it was deepfake porn, against which there are currently no federal laws, as the law lags behind technological advances.

While some states like Virginia and California have passed laws targeting deepfake pornography, the lack of federal protection can leave victims without legal recourse. Honza Cervenka, a lawyer who specializes in nonconsensual pornography, told Refinery29 that for these videos and images to be considered image-based sexual abuse, the breasts or genitals of the person would have to be shown, which is often not the case in deepfake pornography. “It sort of falls through the cracks of many of the laws that were written with the original revenge pornography, rather than this more sophisticated deepfake imagery,” Cervenka said. 

Uldouz Wallace, an Iranian actress, was one of the stars targeted by the 2014 iCloud hack, in which private photos of celebrities including Kirsten Dunst, Jennifer Lawrence, and Kate Upton, were leaked online. Wallace, who was 25 when her private photos were hacked, watched in the ensuing years as deepfake pornography was made from her photos. “It’s several layers of different types of abuse,” Wallace said, “with the deepfake aspect of it after the whole initial hack and leak. There’s just so much [fake content] now that I don’t even know what’s what.”

Wallace is now affiliated with the Sexual Violence Prevention Association (SVPA), an organization that uses “advocacy, education, and community engagement” to “create a world where everyone can live free from the threat of sexual violence.” In an open letter, the SVPA is calling on Congress to ban deepfake porn. “Right now, there are no [federal] laws banning the creation or distribution of deepfake porn,” the letter reads. “Until there are consequences, deepfake pornography will continue to increase.”

Omny Miranda Martone, the Founder & CEO of SVPA, said the organization is committed to helping pass federal legislation against deepfake pornography and educating people on why it’s so harmful. “People are like, well, why do [victims] even care? It’s not real anyways. It’s not actually them,” Martone said. “I don’t think a lot of people fully understand the consent piece of this – that you don’t have the person’s consent and this is a violation of autonomy and privacy.”

As the use of artificial intelligence and deepfake technology becomes even more common, there is an increasing need for protective standards to be set through bills like the Preventing Deepfakes of Intimate Images Act, which was introduced by Rep. Joseph Morelle (D–N.Y.). “As artificial intelligence continues to evolve and permeate our society, it’s critical that we take proactive steps to combat the spread of disinformation and protect individuals from compromising situations online,” Morelle said. As of publication, the bill has not advanced through the House of Representatives.
