
Are deepfakes the new revenge porn?

New face-swap technology means this is a growing issue

Gemma Askham

Guidance: contains adult themes

‘Emma Watson’ is naked, kneeling on a white sofa. Nearby, ‘Maisie Williams’ is sat legs akimbo, masturbating. ‘Gal Gadot’ is riding cowgirl. ‘Jennifer Lawrence’ is doing it doggy style.

Where are we? Not in the murky depths of a horny teenager’s fantasy, but on the site Pornhub – where the only differences between the Hollywood film star Emma Watson and the fake porn star ‘Emma Watson’ are a few pixels, several layers of clothing, and a total lack of consent. Welcome to the world of deepfakes: a new kind of X-rated identity theft.

Deepfakes are porn videos doctored using AI face-swap technology, so the adult performer’s face is replaced with somebody else’s. Only six months ago, making sexy switcheroos required mind-numbingly complex coding and a hell of a lot of free time.

But, now, a ‘deepfake porn’ internet search brings up almost 700,000 results. The tech has exploded so quickly – and become so easy to use – that pretty much anyone with the urge can build face-customised porn in around 12 hours. That’s less time than it takes to arrange a real-life hook-up on a dating app.

Celebrities are the most popular victims. This kind of face-fakery actually began in Hollywood. “The technology to superimpose celebrity faces onto other people’s has been available to movie special effects departments for years,” says Evgeny Chereshnev, CEO of security technology company BiolinkTech.

Emma Watson has been targeted by deepfakes (Getty/BBC Three)

Advanced visual effects (VFX) are what enabled new scenes with Paul Walker to be ‘filmed’ for Fast and Furious 7, after he had died. The tech was also used to resurrect the late Peter Cushing as Grand Moff Tarkin in Rogue One.

Doctored photos aren’t anything new, but the availability and level of realism of modern deepfakes is.

Deepfakes got their name from a Reddit user. Last year, ‘deepfakes’ perfected a complex algorithm that created creepy videos appearing to feature Gal Gadot, Taylor Swift and Scarlett Johansson performing pornographic acts. Even if your brain knows that Taylor couldn’t possibly be the person doing that, your eyes are pretty convinced.

Then, in January, another Redditor created a free app called FakeApp with an in-built algorithm to do the face-swapping for you. You'd need a high-powered computer (the kind a filmmaker would have), a graphics processing unit (GPU) and enough images of your target – which, due to social media, YouTube and even LinkedIn, isn’t that hard – but, theoretically, anyone could turn a porn star into someone else.

Now, basic versions are available to ordinary people: for £20 a month, Adobe can supply the tools to create a digital copy of someone – though expensive professional software would be needed to take it to the next level. Adobe VoCo, a kind of Photoshop for audio, even enables the user to recreate someone’s voice after only 20 minutes of listening to it. It’s still in the research stages, but other companies like Lyrebird have more basic software already available to use.

People can now replace porn actors' faces with those of their long-term crush, say. Or maybe they want vengeance on an ex, and they do it to sabotage the ex’s career or new relationship. Or, indeed, to sabotage the career or relationship of anyone they’re angry with.

Celebrities have an army of lawyers to tackle issues like this (Getty/BBC Three)

Suddenly, it isn’t just celebrities (with their armies of powerful lawyers) who could find ‘themselves’ on people’s laptops. It’s you.

The lawyer Ann Olivarius, who has been working with victims of revenge porn since it was made a criminal offence in 2015, says she has received calls from clients saying they have been victims of deepfaking.

"It's a great concern to us," she says. "It's really devastating for these women because with new technology, it can look like the real thing. The intent behind it is always to hurt and to degrade."

She believes that deepfaking is part of the same problem as revenge porn. "There are so many different types of revenge porn out there," she says. "It's a growing problem, and it keeps manifesting itself in new ways."

While celebrities can call on expensive lawyers, and can potentially use defamation law to prosecute deepfakers (providing they can prove the image has caused, or is likely to cause, serious harm to their reputation), it can be a lot harder for ordinary people to take action.

In January, an Australian man was sentenced to 12 months in jail after Photoshopping his teenage stepdaughter’s face onto women engaged in sex acts, including bestiality, and there have been other similar cases in the UK.

But, says Luke Patel, a specialist in privacy law at Blacks solicitors, “The influx of fast-paced developments in technology makes it very difficult for laws to keep up and adequately support victims.”

The law currently makes no explicit reference to deepfakes, though on 25 May the General Data Protection Regulation (GDPR) will be implemented. It includes two new tools, the ‘Right of Erasure’ and the ‘Right to Be Forgotten’, which Luke believes could “enable an individual to request the permanent removal or deletion of their personal data (including images) when there is no good reason for its continued publication” - though each case will be decided on an individual basis.

“It’s not an absolute right, but the case is stronger if the image is unwarranted and causes substantial distress. Although,” he continues, “they are still only tools that can be deployed when damage has already occurred.” They won’t stop it from happening in the first place.

Deepfaking could become the new revenge porn (Getty/BBC Three)

If internet platforms stopped hosting deepfakes, that could stem the rising tide. Reddit has banned deepfakes, calling them an unacceptable form of “involuntary pornography”.

Pornhub claimed in February to be following suit, but if you search for ‘porn’ and ‘deepfakes’, the top results are all on Pornhub. BBC Three contacted Pornhub for an explanation, and received this statement from its VP, Corey Price: “Nonconsensual content is not permitted on our platform as it violates our Terms of Service.”

I flagged at least 20 deepfake videos on the site, including those featuring celebrities - which we could safely assume were not consensual. Pornhub responded that they have a Content Removal Page, where people can “request the removal of nonconsensual material. As soon as they make a request, we work to promptly remove the content. We also rely on our community and/or content owners to flag inappropriate content.”

Creating deepfakes is becoming so easy it could become a party game: bring photos, booze, and, instead of watching YouTube videos, sit around and create stolen-identity porn.

“In a couple of years, you could be able to go to a porn app store and buy VR sex experiences with anyone you want,” says Evgeny Chereshnev, of BiolinkTech.

You can already buy incredibly realistic sex dolls. “Soon,” says Evgeny, “technology could allow someone to steal your identity and order a sex doll with your face on it.”

The thought of a future where ‘you’ could exist on someone’s laptop, or where images of ‘you’ could be created solely to be maliciously circulated - or where 'you' could even be sitting in sex doll form in the corner of someone’s bedroom - is deeply disturbing. It may already be here.

This article was originally published on 24 April 2018.
