Awareness of the harm caused by online pornography is rising. Last month, the government bowed to pressure from campaigners and pledged to make depictions of strangulation illegal. Research showing that a majority of children have viewed this kind of material is extremely disturbing, all the more so given evidence that viewing “choking” makes people – mostly men – more likely to do it in real life. This week, the Guardian examined the distressing effects of deepfake pornography in schools, and interviewed the women behind the successful campaign to criminalise the nonconsensual creation of deepfake intimate images.
Ofcom’s announcement that it has issued a £1m fine to a Belize-based pornography company, AVS Group, thus seems timely. Oliver Griffiths, the regulator’s director of online safety, referred on BBC radio to a “tide turning” as enforcement powers in the Online Safety Act take effect. The age-verification checks on AVS’s websites, introduced to protect children, were judged not to be effective enough. If the company does not pay up, Mr Griffiths said that he would move to block its sites.
One thing everyone connected with online regulation agrees on is that technology is moving at an incredible rate. The danger is that societies, and the systems we use to manage risks, can’t keep up. While it is good to see Ofcom taking action, it is alarming that 90 other companies – 83 of which run pornography sites – are also being investigated, with further fines said to be imminent. Liz Kendall, the technology secretary, warned last month that the regulator was in danger of losing public trust if it did not up the pace of implementation of the Online Safety Act and get on top of emerging threats.
Regulation and enforcement in the online space are complicated. When Ofcom issued a £20,000 fine to the controversial forum 4chan, the companies behind it and another forum, Kiwi Farms, filed a legal case. They want a US court to rule that the UK’s online safety laws and codes of conduct do not apply to them.
Such challenges make Ofcom’s task harder. But it must not be cowed. Campaigners such as Ian Russell, whose daughter Molly took her own life after viewing harmful material, are right to highlight the moral imperative to make the internet safer for children. They want the proposed new “duty of candour” for public officials to apply to tech companies too, and for the regulator to be proactive rather than passive.
This is all made even more urgent by emerging concerns about agentic AI chatbots, which stand accused of acting as “suicide coaches” in several US lawsuits. If Ms Kendall believes that there are gaps in current online safety laws, as she said on Wednesday, she must close them. Beeban Kidron, the crossbench peer and online safety campaigner, has laid amendments to the government’s crime and policing bill that would achieve this. Ministers should not drag their heels, as happened with deepfake images.
Online safety is not the only area in which Ofcom is accused of damaging inaction. It has also appeared reluctant to tackle racism and misinformation about climate policy on GB News. Regulatory oversight of news and other media has always been important. But in our age of pocket computers, shape-shifting tech and political polarisation, questions about Ofcom’s performance – particularly in relation to children – have arguably never been more pressing.