Artificial intelligence “nudification” apps that create deepfake sexual images of children should be immediately banned, amid growing fears among teenage girls that they could fall victim, the children’s commissioner for England is warning.
Girls said they had stopped posting images of themselves on social media for fear that generative AI tools could be used to digitally remove their clothes or sexualise them, according to the commissioner's report on the tools, which draws on children's experiences. Although it is illegal to create or share a sexually explicit image of a child, the technology that enables such images remains legal, the report noted.
“Children have told me they are frightened by the very idea of this technology even being available, let alone used. They fear that anyone – a stranger, a classmate, or even a friend – could use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps,” the commissioner, Dame Rachel de Souza, said.
“The online world is revolutionary and quickly evolving, but there is no positive reason for these particular apps to exist. They have no place in our society. Tools using deepfake technology to create naked images of children should not be legal and I’m calling on the government to take decisive action to ban them, instead of allowing them to go unchecked with extreme real-world consequences.”
De Souza urged the government to introduce an AI bill that would require developers of GenAI tools to address the risks their products pose, and to roll out effective systems to remove sexually explicit deepfake images of children. This should be underpinned by policymaking that recognises deepfake sexual abuse as a form of violence against women and girls, she suggested.
In the meantime, the report urges Ofcom to ensure that age verification on nudification apps is properly enforced and that social media platforms prevent sexually explicit deepfake tools being promoted to children, in line with the Online Safety Act.
The report cited a 2025 survey by Girlguiding, which found that 26% of respondents aged 13 to 18 had seen a sexually explicit deepfake image of a celebrity, a friend, a teacher, or themselves.
Many AI tools appear to work only on female bodies, which the report warned is fuelling a growing culture of misogyny.
One 18-year-old girl told the commissioner: “The narrative of Andrew Tate and influencers like that … backed by a quite violent and becoming more influential porn industry is making it seem that AI is something that you can use so that you can always pressure people into going out with you or doing sexual acts with you.”
The report noted a link between deepfake abuse and suicidal ideation and PTSD, citing the case of Mia Janin, who died by suicide in March 2021.
De Souza wrote in the report that the new technology “confronts children with concepts they cannot yet understand”, and is changing “at such scale and speed that it can be overwhelming to try and get a grip on the danger they present”.
Lawyers told the Guardian that they were seeing this reflected in an increase in cases of teenage boys being arrested for sexual offences because they did not understand the consequences of what they were doing, for example experimenting with deepfakes, being in a WhatsApp chat where explicit images were circulating, or looking up porn featuring children their own age.
Danielle Reece-Greenhalgh, a partner at the law firm Corker Binning who specialises in sexual offences and possession of indecent images, said the law was “trying to keep up with the explosion in accessible deepfake technology”, which was already posing “a huge problem for law enforcement trying to identify and protect victims of abuse”.
She noted that app bans were “likely to stir up debate around internet freedoms”, and could have a “disproportionate impact on young men” who were playing around with AI software unaware of the consequences.
Reece-Greenhalgh said that although the criminal justice system tried to take a "commonsense view and avoid criminalising young people for crimes that resemble normal teenage behaviour … that might previously have happened behind a bike shed", arrests could be traumatic experiences and have consequences at school or in the community, as well as longer-term repercussions such as needing to be declared on an ESTA form to enter the US or showing up on an enhanced DBS check.
Matt Hardcastle, a partner at Kingsley Napley, said there was a “minefield for young people online” around accessing unlawful sexual and violent content. He said many parents were unaware how easy it was for children to “access things that take them into a dark place quickly”, for example nudification apps.
“They’re looking at it through the eyes of a child. They’re not able to see that what they’re doing is potentially illegal, as well as quite harmful to you and other people as well,” he said. “Children’s brains are still developing. They have a completely different approach to risk-taking.”
Marcus Johnstone, a criminal solicitor specialising in sexual offences, said he was working with an “ever-increasing number of young people” who were drawn into these crimes. “Often parents had no idea what was going on. They’re usually young men, very rarely young females, locked away in their bedrooms and their parents think they’re gaming,” he said. “These offences didn’t exist before the internet, now most sex crimes are committed online. It’s created a forum for children to become criminals.”
A government spokesperson said: “Creating, possessing or distributing child sexual abuse material, including AI-generated images, is abhorrent and illegal. Under the Online Safety Act platforms of all sizes now have to remove this kind of content, or they could face significant fines.
“The UK is the first country in the world to introduce further AI child sexual abuse offences, making it illegal to possess, create or distribute AI tools designed to generate heinous child sexual abuse material.”
In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In the US, call or text the Childhelp abuse hotline on 800-422-4453. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.