'Deepfake porn images still give me nightmares'

By Sara Royle
BBC Radio 5 Live

Image caption: Helen Mort was told no action could be taken over the deepfake porn images

A woman who has been the victim of deepfake pornography is calling for a change in the law.

Last year, Helen Mort discovered that non-sexual images of her had been uploaded to a porn website.

Users of the site were invited to edit the photos, merging Helen's face with explicit and violent sexual images.

Speaking to BBC Radio 5 Live's Mobeen Azhar, Helen said she wanted to see the creation and distribution of these images made an offence.

"This is a crime which in many cases is going on invisibly," Helen said. "Those images of me had been out there for years and I didn't know about them, and I'm still having nightmares about some of them now. It's an incredibly serious form of abuse."

Deepfakes are realistic computer-generated images or videos based on a real person.

Media caption: Actress Bella Thorne opens up about her experience of deepfake abuse

Helen, a poet and writer from Sheffield, was alerted to the deepfake images by an acquaintance.

The original images were taken from her social media and included holiday pictures and photos from her pregnancy.

She said that although some of the images were clearly manipulated, there were a few more "chilling" examples that were "a lot more plausible".

"You go through different phases with things like this," she said. "There was one point where I was just trying to laugh about the almost ridiculous nature of some of it.

"But obviously, the underlying feeling was shock and actually I initially felt quite ashamed, as if I'd done something wrong. That was quite a difficult thing to overcome. And then for a while I got incredibly anxious about even leaving the house."

She alerted the police to the images but was told that no action could be taken.

The law

Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.

"In England and Wales, under section 33 of the Criminal Justice and Courts Act 2015, it is an offence to non-consensually distribute a private sexual photograph or film with the intent to cause distress to the person depicted," she said.

"But this only applies where the original photo or video was private and sexual.

"In Helen's situation, where non-sexual photos were merged with sexual photos, this isn't covered by the criminal offence.

"Furthermore, as the photos were not shared with Helen directly, nor did the intention seem to be to cause distress to Helen, the second element is not fulfilled - even though it did, evidently, cause distress. The other potential criminal offence would be harassment, but given the perpetrator here did not direct it at Helen herself, this didn't apply either."

Media caption: Deepfake videos: Can you really believe what you see?

The independent Law Commission is currently reviewing the law as it applies to taking, making and sharing intimate images without consent. The outcome of the consultation is due to be published later this year.

However, Dr O'Connell said the process of changing the law would take years, which she described as "too long".

Helen hopes to use her experience to raise awareness around deepfake pornography and has launched a petition calling for a change in the law.

It has received more than 3,400 signatures.

She has also written a poem in response to the images.

"I'm a writer by trade," she said. "And I thought the only thing that is going to allow me to reclaim any sense of agency here is to say something about it using my art form. That's the only power that I have.

"The intention of this person, as they said in their post, was to humiliate. They said they wanted to see this person humiliated, and I thought well actually I'm not humiliated, and I'm going to speak out about it because I shouldn't be the one who feels ashamed."

The Home Office said it was taking steps to tackle new and emerging forms of violence against women and girls, including intimate image abuse, "whether this be cyber flashing, revenge porn or deep fake videos."

"We are currently consulting on the development of our new strategy to tackle violence against women and girls and we encourage people to give their views," a spokesperson said.

"This new strategy will ensure victims and survivors are supported, and that perpetrators are identified and brought to justice."