We should all celebrate the deletion of ‘Deepfake’ celebrity porn videos

Posted by
Moya Crockett

It’s a rare sign of online platforms stepping in to protect women from harassment and abuse.

One of the more disturbing stories to rear its head in recent weeks was the news that women’s faces were being virtually transplanted onto the bodies of anonymous porn actresses. No, that’s not the plot of an upcoming episode of Black Mirror: that is a real thing that’s happening.

A tool called FakeApp, developed by a not-at-all-creepy guy in the States, makes it possible to replace the faces of actresses in porn films with those of celebrities or ordinary women. Users are required to submit hundreds of photos of the person they want to see appear in a porn clip, as well as the clip itself. The app then uses a machine-learning algorithm to create a computer-generated version of the subject’s face, merged with the porn actress’s body, in videos called ‘deepfakes’ (the ‘deep’ refers to the deep-learning technology involved).

So far, so gross. But the free software, which has reportedly been downloaded more than 100,000 times since its release in January, is made even more troubling by its effectiveness. Often, it works. Screenshots of porn videos ostensibly starring women such as Emma Watson, Daisy Ridley and Ariana Grande are now floating around the internet – and if you didn’t know about the existence of deepfakes, you could be forgiven for thinking these screenshots were real. Tech site Motherboard, which first reported the deepfake trend, says that people are now using FakeApp to non-consensually create fake porn videos starring acquaintances, friends, classmates and ex-partners.

The invention of FakeApp is a deeply disconcerting development in technology that seems designed, at least in part, to humiliate and exert control over women. “Ha ha,” you can imagine users thinking, as they digitally manipulate their ex-girlfriend’s face onto a porn actress’s body. “You might have rejected me in real life, but now I can make you do whatever I want.” One can hardly conceive of how unsettling it must feel to see an AI-generated version of yourself appearing, entirely without your agreement, in a graphic porn film. 

So it is heartening to hear that Gfycat, a website that hosted many user-uploaded deepfake videos, has decided to delete them. In a statement, Gfycat said: “Our terms of service allow us to remove content that we find objectionable. We are actively removing this content.” 

Left: a FakeApp still using imagery of Gal Gadot, pictured right.

BBC News reports that while Reddit has yet to block FakeApp porn from its channels, other websites have taken steps to block people from sharing deepfake videos. The chat service Discord shut down a group where deepfakes were being exchanged, saying that the videos violated its rules on non-consensual pornography.

These moves to block the spread of pornographic deepfakes are encouraging, because they are relatively rare examples of internet services taking decisive action to protect women from distress and abuse. All too often, we are presented with a vision of the internet as a kind of Wild West, where rules not only do not apply, but cannot possibly be enforced; where the harassment of women is par for the course.

The most obvious example of this is Twitter, which continues to be a place where women – not least women of colour, transgender women, women who subscribe to minority religions and disabled women – are routinely subjected to unmoderated abuse. Research conducted last summer by the Fawcett Society and Reclaim the Internet found that Twitter was failing to remove messages containing abuse, threats or hate speech, even when they clearly violated the website’s community standards (in addition, users who reported these tweets frequently received no response at all).

An Amnesty investigation, meanwhile, found that more than 25,000 abusive tweets were sent to women MPs over a six-month period in 2017 (half of which were directed at Diane Abbott, the UK’s first black woman MP). Twitter has insisted that it is strengthening its anti-abuse measures, but those who spend any time at all on the website will tell you that trolling and threats are still rampant.

Then there’s Facebook. The social media giant recently introduced new measures to tackle sexual harassment and revenge porn, but chief operating officer Sheryl Sandberg openly acknowledges that the platform has not done enough to address abuse. Videos of women being attacked on the street are allowed to go viral, while black women who post about racial and gender politics have their content removed for “violating community standards”. The Guardian reports that Facebook is also currently facing a number of lawsuits from victims who say it hosted naked and compromising pictures of them. At least one teenage girl has settled out of court with the site, after naked images of her were posted on a so-called “shame page”.

Many of us now live much of our lives online, yet the internet can be a hazardous home for far too many women. Against a dark backdrop of tech company inaction – not to mention the fact that deepfakes are not currently covered by revenge porn laws – websites’ decisions to block deepfake videos should be seen as a small spot of light. They make the internet a fractionally more hospitable place for women, and that can only be a good thing.

Images: FakeApp / Rex Features

Author

Moya Crockett

Moya is Women’s Editor at stylist.co.uk, where she is currently overseeing the Visible Women campaign. As well as writing about inspiring women and feminism, she also covers subjects including careers, podcasts and politics. Carrying a tiny bottle of hot sauce on her person at all times is one of the many traits she shares with both Beyoncé and Hillary Clinton.
