“I was porn deepfaked by my best friend – my whole world collapsed around me”

A woman who was deepfaked by her best friend and found fake pornographic videos of herself online is campaigning for the practice to be made illegal. In 2019, ‘Jodie’ – who goes by a pseudonym – from Cambridgeshire began seeing her social media pictures pop up on dating websites including OkCupid and Happn, but tried to ignore it.

Then they appeared on Twitter[1] alongside messages soliciting sex, before hundreds appeared on websites where men post ex-partners’ pictures and encourage trolling. She went to the police, but says officers could do nothing because she did not know who was sharing the images.

Then, in March 2021, an anonymous email alerted her to a non-mainstream porn website where she found ‘deepfaked’ images and videos of herself performing sexual acts. People had taken ordinary, fully clothed photos that ‘Jodie’ had shared on social media and manipulated them using AI into sexually explicit images and videos.

Traumatised ‘Jodie’ confided in her family, friends, and boyfriend – and felt it was “the ultimate violation”. She eventually worked out who it was – her own best friend, Alex Woolf – after spotting a photo shared that only he had access to.

‘Jodie’ reported it to the Met Police, but the creation and solicitation of deepfake images was not – and still is not – considered a crime. In August 2021, Woolf admitted 15 charges of sending messages that were grossly offensive or of an indecent, obscene or menacing nature over a public electronic communications network.

The derogatory comments accompanied pictures of the women, including ‘Jodie’, which he had taken from social media and uploaded to pornographic websites. None of the pictures was pornographic or indecent, but he asked users to photoshop his victims’ heads onto pornographic actresses’ bodies; the results were then posted on adult websites.


Only ‘Jodie’s’ images were deepfaked, while others were normal images shared alongside grossly offensive language – and it was the language, not the deepfaked pictures of ‘Jodie’, which led to his conviction. Woolf was given a 20-week prison sentence, suspended for two years.

Despite soliciting deepfaked sexual images of ‘Jodie’ made by other users online, he was not charged for doing so, because solicitation is not considered a crime. Jodie, 26, from Cambridgeshire[4], said: “When I saw the AI-generated pictures and videos, I was terrified.”

“There were nine or ten pictures and videos of me being what I can only describe as raped, and a****y penetrated. There was one with a schoolgirl’s body with my face on it, in a student-teacher relationship. It felt like the whole world collapsed around me. To take my photo out of context and have it used like that – I think it’s everyone’s worst nightmare.”

‘Ultimate violation’

“It was the ultimate violation. In my victim impact statement I told how it made me feel suicidal and it has made it difficult for me to trust anyone again. He was cowering in the corner when he was sentenced and he couldn’t even look at me when I spoke to him.”

In April this year, it was announced a new law would be introduced to crack down on deepfake image abuse. But then the Conservatives[5] were voted out of government, leaving ‘Jodie’ and other victims questioning the future of the planned bill.

Clare McGlynn, Professor of Law at Durham University, who supports the campaign, explained the current legal standpoint. She said: “The current law only makes it illegal to distribute or threaten to distribute intimate photos or videos of someone, including deepfake images, without their consent.”

“A vital creation offence was announced in April 2024 under the previous government, which aimed to criminalise the act of making these images in the first place, though it would have only covered certain cases of creation. However, when the general election was called, that commitment fell with the Criminal Justice Bill. So far, the new Labour government has not made any commitment to reintroduce a creation offence, leaving a critical loophole in place.”

‘Jodie’ feels deepfake is “the next iteration of violence against women and girls.” She thinks the current situation allows “loopholes whereby perpetrators can get away with crimes without facing real repercussions or rehabilitation.”

She is now campaigning for harsher penalties for people who solicit and distribute deepfakes, as well as criminalising people who create them. In September, ‘Jodie’ launched a Change.org petition for a campaign in partnership with The End Violence Against Women Coalition (EVAW), #NotYourPorn, Professor Clare McGlynn, and Glamour UK.

‘Violating and humiliating’

The petition reads:[6] “For too long the government’s approach to tackling image-based abuse has been piecemeal and ineffective. This crisis demands more.”

With more than 60,000 signatures, the petition is addressed to Peter Kyle, Secretary of State for Science, Innovation and Technology, and Prime Minister Keir Starmer. Reflecting on her experiences, Jodie said: “I’m still now full of rage. I try to channel it into raising awareness for other women.

“There’s a misconception that because it’s online, it’s not real. But for victims, knowing that people can’t tell these images are fake feels just as violating and humiliating as if they were genuine.”

Professor McGlynn added: “We’re calling for a comprehensive Image-Based Abuse law that doesn’t just address distribution but also the creation and solicitation of these images. Such a law would give victims a path to justice and ensure perpetrators face not only consequences but also the rehabilitation and education needed to break this cycle of abuse. Robust protection like this would finally close these harmful gaps in the law.”

References

  1. ^ Twitter (www.cambridge-news.co.uk)
  4. ^ Cambridgeshire (www.cambridge-news.co.uk)
  5. ^ Conservatives (www.cambridge-news.co.uk)
  6. ^ The petition reads: (www.change.org)