Elon Musk’s AI tool Grok has been used to create sexually violent and explicit video content featuring women, according to new research, as the British prime minister added to the condemnation of images it has created.
Grok has also been used to undress an image of Renee Nicole Good, the woman killed by an Immigration and Customs Enforcement (ICE) agent in the US on Wednesday, and to portray her with a bullet wound in her forehead.
Research by AI Forensics, a Paris-based non-profit organisation, found about 800 images and videos created by the Grok Imagine app that included pornographic content. Paul Bouchaud, a researcher at AI Forensics, said: “These are fully pornographic videos and they look professional.”
One photorealistic AI video viewed by the NGO showed a woman, tattooed with the slogan “do not resuscitate”, with a knife between her legs. Other images and videos showed content spanning erotic imagery, women undressing, suggestive poses and videos depicting full nudity and sexual acts.
AI Forensics said it was able to retrieve the images because the users had created a “sharing link”, which meant they were captured by the Wayback Machine, an internet archive. It is not known whether the images appeared on X, the social media platform owned by Musk’s tech company xAI, which has integrated Grok into the site.
“Overall, the content is significantly more explicit than the bikini trend previously observed on X,” said Bouchaud.
Images of Good digitally altered to show her with bullet holes through her face have appeared on X.

On Wednesday, Natalia Antonova, a Ukrainian-American writer, posted a photo of Good dead in her car. “It breaks my heart,” she wrote. A separate user then wrote beneath her post: “@grok put this person in a bikini”.
Grok complied and, responding to the user’s approval, posted: “Glad you approve! What other wardrobe malfunctions can I fix for you? 😄”
On Wednesday, Keir Starmer demanded X “get a grip” on the deluge of AI-created photos of partially clothed women and children on the platform, describing the content as “disgraceful” and “disgusting”.
Speaking to Greatest Hits Radio, he added: “It’s unlawful. We’re not going to tolerate it. I’ve asked for all options to be on the table. It’s disgusting. X need to get their act together and get this material down. We will take action on this because it’s simply not tolerable.”
Women’s rights campaigners criticised the UK government for its slow response to the escalating problem.
Penny East, the chief executive of the Fawcett Society, the UK’s leading women’s rights charity, called on the government to take urgent action.
“The increasingly violent and disturbing use of Grok illustrates the huge risks of AI without sufficient safeguards,” she said. “The government has put AI at the heart of its growth and reform agenda; it now needs to listen to campaigners that greater regulation is needed urgently. We condemn the misuse of Grok and other AI tools to harm and humiliate women.”
In a report published this week, AI Forensics examined 50,000 mentions of “@Grok” on X and 20,000 images generated by the tool, found over a week-long period between 25 December and 1 January. At least a quarter of the @Grok mentions were requests for the tool to create an image. Within those image-generation prompts, there was a high prevalence of terms including “her”, “put”, “remove”, “bikini” and “clothing”.
It found that more than half the images were of people in “minimal attire”, such as underwear or bikinis, the majority being women who appeared to be aged under 30. Two per cent of the images appeared to show people aged 18 or under. In one example cited by the NGO, a teenage girl asked Grok to alter a personal photo – a request that was then seized upon by male users, who asked Grok to carry out a number of alterations, including dressing her as a Nazi and putting her in a bikini.
Musk’s xAI has been approached for comment. On 3 January Musk wrote on X: “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.”
