According to legal experts, Channel 4 may have violated the Sexual Offences Act 2003 with their recent documentary about nonconsensual, AI-generated pornography, Vicky Pattison: My Deepfake Sex Tape.
During the programme, it shows what appears to be deepfaked footage of Scarlett Johansson in lingerie. She lies back on a bed strewn with underwear and heart-shaped petals while a voiceover announces: “Over the past decade there has been a trend of using AI to create videos of celebrities like Scarlett Johansson and Margot Robbie in highly explicit and degrading scenarios.”
Lawyers specialising in the law around sexually explicit AI-generated imagery have suggested that this may put Channel 4 in violation of the act, which specifically forbids nonconsensually sharing computer-generated imagery that appears to show someone in underwear.
“It could, I think, breach the Sexual Offences Act if this were a deepfaked image of someone taken without their consent,” says Clare McGlynn, a professor of law, who specialises in the legal regulation of pornography, sexual violence and online abuse. “Because it is an image that falls within that act.”
The decision to include AI-generated intimate imagery of Johansson has been criticised as being in particularly poor taste, given how vocal she has been in opposing the sharing of deepfaked imagery of women.
She has previously called it “demeaning” and talked of her frustration that “nothing can stop someone from cutting and pasting my image or anyone else’s on to a different body and making it look as eerily realistic as desired”.
“Scarlett Johansson was one of the very first celebrities to ever be subject to deepfake sexual abuse,” says McGlynn.
“She has had to live with this since 2017/18, so if it’s a deepfake without her consent, then I think showing it isn’t the best editorial decision, because you risk reproducing the harm and humiliation she has long experienced.”
Some legal experts have pointed out that while Channel 4 may have broken the law with the image of Johansson, it would most likely be able to offer the defence that it had a “reasonable excuse” for doing so, given that the footage appeared as part of a documentary the broadcaster has said was intended to raise awareness of deepfake pornography.
“That would be left up to the courts to decide,” says Alice Trotter, an associate lawyer at Kingsley Napley, who has written about the regulation of sexually explicit deepfakes.
“There are also exemptions … one of which is if the person who shared the image reasonably believed it had previously been publicly shared, and the individual in the picture had consented to the previous sharing, or it was reasonably believed that they had.”
This is the latest problem faced by Channel 4’s documentary, after campaign groups representing survivors of deepfake abuse criticised its decision to release an AI-generated pornographic video of host Vicky Pattison, specifically warning that it could increase traffic to the sites they were fighting against.
“Using that image shows a real lack of understanding about the issue and the real-life harm that it causes,” said one campaigner, who requested anonymity and who has also had pornographic AI-generated images of themselves shared online.
“I understand that Channel 4 might argue that this was done to raise awareness, but there are ways to raise awareness without replicating the harm that we’re fighting against. It’s not just about nudity, and that is laid out in the Sexual Offences Act. It’s about the violation of someone’s identity and autonomy, and their consent. It all comes back to consent.”
Lawyers have also raised concerns that using a potentially illegal image undermines efforts to change the law around creating deepfake imagery. McGlynn appeared in the Channel 4 documentary (without full knowledge of how the programme would be made) to urge the need for robust legislation that would force tech giants to act.
“I think the thrust of the programme – trying to shine a light on this issue – was really positive,” she says. “My biggest regret with how Channel 4 have gone about it is that this [the ethics/legality of using deepfake imagery of a celebrity] is the nature of the conversation we are having, rather than the documentary having sparked a bigger discussion.
“It’s really regrettable. We should be focusing on issues such as how the tech giants are involved – about what Google and Instagram are doing.”
A Channel 4 spokesperson said: “Celebrities worldwide have been impacted by the rise in deepfake pornography. Channel 4 took steps to ensure we only identified celebrities who have been widely reported as being victims, and we blurred any sexually explicit content. As with all our programmes, we have ensured that we have adhered to all relevant laws and regulations.”
Scarlett Johansson was contacted for comment.