The new Data (Use and Access) Act, which criminalises the creation of sexually explicit deepfakes without consent, is a huge victory won fast in a space where progress is often glacially slow
For Jodie*, watching the conviction of her best friend, and knowing she helped secure it, felt at first like a kind of victory. It was certainly more than most survivors of deepfake image-based abuse could expect.
They had met as students and bonded over their shared love of music. In the years since graduation, he’d also become her support system, the friend she reached for each time she learned that her images and personal details had been posted online without her consent. Jodie’s pictures, along with her real name and correct bio, were used on many platforms for fake dating profiles, then adverts for sex work, then posted on to Reddit and other online forums with invitations to deepfake them into pornography. The results ended up on porn sites. All this continued for almost two years, until Jodie finally worked out who was doing it – her best friend – identified more of his victims, compiled 60 pages of evidence, and presented it to police. She had to try two police stations, having been told at the first that no crime had been committed. Ultimately he admitted to 15 charges of “sending messages that were grossly offensive or of an indecent, obscene or menacing nature” and received a 20-week prison sentence, suspended for two years.
At that time, there were no laws against deepfake intimate image abuse, although experts had been raising the alarm since 2016. “I felt really lucky to get a conviction, something that will affect the rest of his life,” says Jodie. Her main focus in the months that followed had to be her recovery. “I was in a dark place,” she says. “I’m not ashamed to say I was suicidal. I couldn’t sleep, and when I did, I had nightmares.” More than two years passed before she felt ready to campaign for change, starting by telling her story on BBC Radio 4’s File on 4 in April 2024. “The more I learned, the more I understood, and that brought rage,” says Jodie, who works in financial services. “I gradually realised the significance of there being no law which held him accountable for the deepfake abuse – and I wanted change.”
Within little more than a year, she had it. The Data (Use and Access) Act, which received royal assent in June, has made the creation of a deepfake intimate image without consent a criminal offence, and has also criminalised requesting others to create such an image – as her best friend did when he posted images of Jodie on forums. Both offences now carry a custodial sentence of up to six months, as well as an unlimited fine. It’s a huge victory won fast in a space where progress has been mind-bendingly slow. However, this isn’t the story of a new government determined to tackle an ever-evolving crime. It was fought for and pushed through by a small group of victims and experts who formed a WhatsApp group called Heroes.
Jodie joined the group last year, shortly after the File on 4 programme aired. She had been invited to the House of Lords to meet Charlotte Owen, Boris Johnson’s controversial appointment, who had taken her seat the previous summer. Lady Owen was the youngest life peer in history when she arrived in the Lords, aged 30, and she was intent on tackling deepfake intimate image abuse. “It was rapidly proliferating, and such a huge threat,” says Owen. “Loads of women my age were very concerned about it, but it wasn’t getting the spotlight.”
Owen organised a core group of experts and campaigners, which included Clare McGlynn, professor of law at Durham University; the Revenge Porn Helpline; Not Your Porn; MyImageMyChoice; and survivors such as Jodie. They formed a tight team in the WhatsApp group Owen set up, aiming to criminalise the creation and requesting of deepfake intimate image abuse. They were braced for a long fight. “In those initial conversations, I was under the impression that it was going to take years,” says Jodie, “so I’ve been kind of gobsmacked by how quickly it happened.”
It isn’t surprising that expectations were low. Since 2015 and the first law to address so-called “revenge porn”, legal protections have been patchy and piecemeal, always months or years behind the latest development in intimate image abuse.

“There has been such a resistance to taking any action in parliament, a complete lack of urgency and absence of interest,” says McGlynn. “Even back in 2015, when they passed the first law against sharing private sexual materials, myself and others were saying that it ought to include altered images because we were already seeing women who had been Photoshopped. The government response was that altered images weren’t sufficiently harmful to be included.”
The protections that were afforded in that early legislation were full of holes. One glaring example was that it made the sharing of intimate images without consent a crime only if it could be proved that this had been done with “intent to cause distress”. Doing it for a laugh, to impress friends, or for sexual gratification wasn’t a crime at all. (After a decade of campaigning, this should finally be rectified in the crime and policing bill, though it has yet to happen.)
By 2019, cases of intimate image abuse reported to police had more than doubled, and cyberflashing and deepfakes had emerged, so the Law Commission was asked to examine whether new laws were required. It took three years – the final report, Intimate Image Abuse, was not published until July 2022. “In the years in between, we were writing to MPs about what was happening to women and girls and what needed to be done, but the response was always, ‘We just need to wait for the final report’,” says McGlynn. “Five lines added to a piece of legislation could have made a massive difference to millions of women’s lives, but any changes we put forward were rejected.” The report did conclude that the sharing of deepfake abuse should be criminalised – as a result, the Online Safety Act 2023 has made it an offence to share deepfake intimate images without consent. However, criminalising the creation of deepfakes was not recommended. (It acknowledged that “this may be disappointing to those who have been victims of such behaviour”.)
The “Heroes” group set out to change this, and McGlynn at last had some reason to feel hope. They finally had someone on the inside. “Charlotte was right in the middle of the legislative process, and that was essential,” she says. Elena Michael, co-founder of Not Your Porn, which supports victims of intimate image abuse and campaigns for change, agrees. “For a decade, the victims and experts working in the field have been kept in separate rooms to the people making the laws, and the laws we’ve had just haven’t worked,” she says. “When the government doesn’t listen to survivors, they’re not just devaluing their experience, they’re also discarding their expertise. These women have had to fight for years. There’s nothing they don’t know about what change is needed.

“When Charlotte called me up to meet me, I was really excited,” she continues. “From the beginning, she was always about listening and meeting as many victims as she could. She wanted to hear stories and understand every single facet.”
The campaign kicked off in February last year, with Owen’s question in the chamber on deepfake porn and why its harms were not deemed serious enough to criminalise. “It’s a good way of seeing if people care about the issue,” says Owen, “and in the House of Lords, people were shocked. There was so much support.”
The following month, her speech for International Women’s Day again focused on deepfakes. In May, she was drawn from the private members’ bill ballot and set about creating a bill that was essentially written by the women she had gathered together. It criminalised the making of sexually explicit deepfakes and the requesting of them. It was consent-based – if the deepfake was made without consent, it was a crime; the intentions behind it made no difference. It included forced deletion – anyone convicted of creating a deepfake image would be forced to delete it. This was crucial, given one victim’s experience when, after a conviction for intimate image abuse, police returned devices to her perpetrator with the images still on them. “Every single line represented a woman’s real experience,” says Owen.

All those women, including Jodie, were invited to the Lords for briefings and attended all the subsequent debates. “That was the most poignant part,” says Owen. “The Lords were blown away by these brilliant women. When they sat in the gallery, lots of the peers referenced them in their subsequent speeches.” There was strong cross-party support, but by its second reading the government had indicated it would not back the bill, as it wanted to address the issue itself. “We were gutted,” says Owen. “But I don’t really take no for an answer.”
Instead, Owen took the contents of the bill and tabled them as amendments to the data bill that was already before the Lords. (“It was the perfect opening,” she says.) The government initially insisted on writing its own amendment, but its first attempt did not make the offence consent-based, linking it instead to specific intentions. After a lot of argument, including a dossier from McGlynn, it U-turned.
“By now, everything was moving so fast. We were messaging and meeting and updating pretty much all the time,” says Jodie. Every element they asked for was met with resistance, and so required more presentations, new evidence, legal argument and extra briefings.
While the government agreed to criminalise the creation of deepfakes, it initially objected to criminalising the requesting of them. Early drafts also applied a statutory time limit to the offence, which meant it could only be prosecuted within six months of being committed. (Many victims don’t discover the images online until outside that time frame.) Iman Mazhari – an abuse survivor and volunteer with Not Your Porn – spotted this at the 11th hour. Her perpetrator had escaped a harassment charge (but was convicted of others) because it “timed out”, so she is, she says, “obsessed with statutory time limits”. Mazhari emailed McGlynn, who drafted a clause that Owen added to the amendment with days to spare.

Another point of conflict with the government was whether or not the crime should carry a possible prison sentence – the government first opposed this, but Owen believed it was essential as a deterrent. (In her speech, she pointed out that you can go to prison for fly-tipping. Why was the violation of someone’s consent less important?) “Every single word, every line, we were arguing over,” says Owen.
When the bill passed in June, Jodie was watching from the balcony with other survivors of deepfake abuse and McGlynn. “Charlotte left the chamber and sat up there with us,” she says. “It was a gorgeous evening. It felt like we were in a film, and at the end of it, we had a glass of champagne and toasted all the women who had worked for this.
“On the way home, on the tube by myself, the enormity of it really hit me and I just sobbed. We’d fought so hard and achieved so much. Being with these women who were all so passionate, and who just got it – they have healed me. It’s as simple as that.”
It’s not the end, of course. “There’s so much more to do,” says McGlynn. There will certainly be future forms of AI abuse not covered by this legislation. “If I had my way, I’d introduce a general legal provision that would cover all forms of intimate violation and intimate intrusion so we don’t need to start a new campaign every single time the technology moves on and the abuse changes,” says McGlynn. “We also need better processes to get material removed swiftly and easily, and hold non-compliant platforms to account.
“Still, this is significant progress,” she adds. “We’re now able to say, ‘Creating this stuff without consent is wrong.’ It’s unlawful and it’s wrong. The message is clear and that’s great.”
*Not her real name
In the UK and Ireland, Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, you can call or text the 988 Suicide & Crisis Lifeline at 988 or chat at 988lifeline.org. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
In the UK, Rape Crisis offers support for rape and sexual abuse on 0808 802 9999 in England and Wales, 0808 801 0302 in Scotland, or 0800 0246 991 in Northern Ireland. In the US, Rainn offers support on 800-656-4673. In Australia, support is available at 1800Respect (1800 737 732). Other international helplines can be found at ibiblio.org/rcip/internl.html
