
Sinister app that undresses women reminds us who ‘deepfakes’ were designed to target

By Meena Alexander

Much of the conversation around ‘deepfake’ technology is focused on its ability to alter our perceptions of politicians and powerful people, but the emergence of the DeepNude app is a dark reminder of its original purpose: to control women’s bodies.

Last week, Motherboard reported on DeepNude, an app available to the public that allows users to upload any photo of a woman and ‘undress’ her using artificial intelligence. Promoted under the tagline “The superpower you always wanted”, the app was built by an anonymous creator who trained its algorithm to generate these ‘nudes’ by feeding it more than 10,000 images of naked female bodies.

Days after the story was published, it was announced via Twitter that “the world is not yet ready for DeepNude” and the app would be taken down.

After a major backlash, a statement said: “We created this project for users’ entertainment a few months ago. We never thought it would become viral and we would not be able to control the traffic.”

The creator told The Verge that he believed someone else would have developed the app if he hadn’t done so first – “the technology is ready,” he said. 

And therein lies the true horror – that one dingy corner of the deepfake industry has been shut down, but several copycat programs will inevitably spring up in its place.

I am sure of this because, while most deepfake hysteria zeroes in on doctored videos of President Donald Trump and Facebook founder Mark Zuckerberg, the original – and most common – use of this sinister form of AI has always been to degrade, humiliate and control women.

With apps like DeepNude not yet covered by UK revenge porn laws, women will continue to be vulnerable online whether they’ve actually appeared in nude photos and videos or not.  

Because deepfakes are not technically ‘real’ nudes, but rather transposed body parts taken from various sources, they are not yet illegal. But the harm they cause the women they victimise is incalculable.

It is nothing new for men to use tools like the popular FakeApp to create pornographic videos that graft the faces of colleagues, ex-girlfriends and celebrities onto the bodies of porn actors. And the high-profile cases alone are enough to send a chill down your spine.

Last year, Indian journalist Rana Ayyub was discredited and silenced on a national scale after a deepfake porn video was circulated the day before she was to appear on the country’s main news channel.

Over the past year, the actor Scarlett Johansson has been superimposed into dozens of graphic videos marked as “leaked” footage and viewed by millions, some of whom were none the wiser.

Just this week, a doctored image featuring US representative Alexandria Ocasio-Cortez was circulated in a Facebook group of US border agents, appearing to show her engaging in oral sex at a migrant facility. 

In May, President Trump – the most powerful man in the world – tweeted a doctored video of Nancy Pelosi, the speaker of the US House of Representatives.

The footage showed her drunkenly slurring through a speech, and Trump captioned it: “Pelosi stammers through news conference.” The video was soon revealed to be fake, but as far as Trump’s 61.6 million followers were concerned, the damage had been done.

In every case it’s clear what the aim is – to exercise control over women’s bodies and diminish their power. 

The development of easy-to-use, increasingly convincing deepfake technology is a nightmare made real, and no one is more at risk than women. The implications for our relationships, careers and mental health are boundless, and the law desperately needs to catch up.

A Law Commission review opening this month will look into whether deepfake images should be criminalised, and is due to report back in 2021. But the fact that people weaponising AI against women online will not be brought to justice until then, at the very earliest, is a chilling prospect indeed.

Images: Unsplash

Meena Alexander is Stylist’s sub-editor. She mostly writes about music, TV, film and books – all the best things in life.
