
Reddit (Finally) Bans Deepfake Communities, but Face-Swapping Porn Isn’t Going Anywhere

Photoshopping celebrities’ faces onto nude photos has been around for years. Now, new technology has spawned a more sinister upgrade to that practice—and it has online communities scrambling to keep nonconsensual adult content off their platforms.


This week, Reddit, Twitter, and Pornhub became the latest platforms to ban deepfakes: fake porn videos in which machine learning software superimposes celebrities' (and other people's) faces onto the bodies of adult film performers. With free, easy-to-use face-swapping tools, users can create videos that are nearly indistinguishable from reality.

Deepfakes became popular late last year on a subreddit, r/deepfakes, named after the Redditor who first began posting the content; it had almost 90,000 subscribers by the time the site shut it down Wednesday. The videos targeted Gal Gadot, Scarlett Johansson, Daisy Ridley, and others. They were (obviously) posted without the women's consent.

In addition to banning the subreddit, Reddit updated its community rules to make its policy on involuntary pornography clearer. The rule, which was previously combined with regulations regarding sexual content involving minors, is now its own distinct policy.

“Reddit strives to be a welcoming, open platform for all by trusting our users to maintain an environment that cultivates genuine conversation,” Reddit told Motherboard on Wednesday.


Reddit’s decision came on the heels of other platforms banning deepfakes. Twitter denounced the practice Tuesday, saying it violates the platform’s intimate media policy. Users found in violation of the rule, which states that intimate photos and videos cannot be produced or distributed without consent, will have their accounts suspended.

That same day, Pornhub made its stance clear. The pornography site, which gets more than 75 million visitors a day, will delete deepfakes that its users flag.

“We do not tolerate any nonconsensual content on the site, and we remove all said content as soon as we are made aware of it,” a spokesperson told Motherboard. “Nonconsensual content directly violates our (terms of service) and consists of content such as revenge porn, deepfakes, or anything published without a person’s consent or permission.”

But it’s unlikely that Pornhub will be able to fully enforce its policy. The site is deleting only videos that users flag, meaning unflagged content abounds. Motherboard reported that dozens of videos clearly labeled as deepfakes remained on the site.


And, of course, nothing posted online truly disappears. The tools to make deepfake videos are still easily accessible, and plenty of people know how to use them. Deepfake video makers use browser extensions like Instagram Scraper and DownAlbum to mine photos from social media, which can then be loaded into a desktop application like Porn World Doppelganger that finds suitable lookalikes. Videos on the subreddit showed followers how to use such tools, and when it was shut down, deepfake enthusiasts simply migrated to other chatrooms where they could continue swapping tips and tricks.

While celebrities were some of the earliest victims, people are now creating deepfake videos of friends, co-workers, classmates, crushes, and exes. The technology is set to become the latest tool for creating revenge porn—the posting of sexually explicit material to embarrass a former partner. But revenge porn and other privacy laws don’t address face-swapping content, so there is little legal recourse for people who are victims of deepfakes. Mary Anne Franks, who helped write much of the country’s laws regarding nonconsensual porn, told Wired that victims can’t sue someone for a privacy violation when it’s not technically their body or their life that’s being exposed.

Celebrities can sue for the misappropriation of their image when deepfakes are used for commercial purposes, but the best chance for someone who isn't a public figure is to use defamation law to prove emotional distress, Franks said.