What are the Ofcom measures to protect children online – and will they work?


The UK communications watchdog has set out more than 40 measures to keep children safe online under a landmark piece of legislation.

The Online Safety Act has a strong focus on protecting under-18s from harmful content and the codes of practice published by Ofcom on Thursday are a significant moment for regulation of the internet.


What are the measures set out by Ofcom?

The measures, which apply to sites and apps, video platforms such as YouTube and search engines, include:

- Social media algorithms, which push content towards users, must filter out harmful content from children’s feeds.
- Risky services, which will include major social media platforms, must have “effective” age checks so they can identify those under 18 and shield them from harmful content (or make the entire site safe for children).
- Sites and apps must “quickly tackle” harmful content.
- Children must have a “straightforward” way to lodge complaints and report content.
- All services must have a named executive responsible for children’s safety.

Broadly, the act requires sites and apps likely to be accessed by children to suppress the spread of harmful content, such as violent, hateful or abusive material and online bullying. There are other categories of content that need to be kept off children’s feeds altogether such as pornography and material related to self-harm, suicide and eating disorders.


What happens if companies don’t follow the measures?

From 25 July, sites and apps covered by the codes need to implement those changes – or use other “effective measures” – or risk being found guilty of breaching the act.

If companies fail to comply with the requirement to protect children from harmful content, Ofcom can impose fines of up to £18m or 10% of global revenue, whichever is greater. In the case of a company such as the Facebook parent, Meta, that would equate to a fine of $16.5bn (£12.4bn). For extreme breaches, Ofcom can ask a court to prevent the site or app from being available in the UK.

Senior managers at tech firms will also be criminally liable for repeated breaches of their duty of care to children and could face up to two years in jail if they ignore enforcement notices from Ofcom.


Do the codes tackle online misogyny and toxic male influencers?

The Netflix series Adolescence has enhanced the scrutiny of online misogyny and the reach of misogynist influencers such as Andrew Tate. Ofcom says the codes tackle online misogyny by requiring platforms to ensure their algorithms downplay content that, for instance, demeans women or promotes the idea that men are superior to women.

“We expect companies to not be pushing this type of content,” says Almudena Lara, an online safety policy director at Ofcom.

Ofcom’s guidance on content harmful to children, which platforms must protect them from encountering, includes a “hateful or aggressive misogynistic comment targeting a woman or girl” and a “post or comment attacking someone based on their gender using offensive, demeaning language to describe them”.


Will the proposals work?

Before the Online Safety Act, there was no all-encompassing legislation aimed at social media platforms and search engines. So the threat of fines and a clear instruction to protect children from harmful content should have an impact. Regulation of the online space is no longer a loose amalgam of existing laws (such as legislation covering malicious communications) and self-regulation.


What do critics of the measures say?

The Molly Rose Foundation, established by the family of Molly Russell, a British teenager who took her own life after viewing harmful online content, believes the measures do not go far enough in various areas: they do not stop dangerous online challenges, do not address recent moderation changes at Meta and Instagram, and lack harm reduction targets.

The NSPCC, a child protection charity, wants tougher measures on strongly encrypted messaging services such as WhatsApp – an ongoing issue for safety campaigners – although it describes the measures overall as a “major step forward”.


Will the measures come under pressure from the US?

The Online Safety Act has been highlighted as a potential bargaining chip in US-UK trade talks, with a report this month claiming that a draft transatlantic trade agreement contains commitments to review enforcement of the legislation. However, the report from online newsletter Playbook said the review would not be a “do-over” of the act. The Guardian has also reported that the US state department has challenged Ofcom over the act’s impact on freedom of expression.

The science, innovation and technology secretary, Peter Kyle, has taken a firm stance against amending the act. Speaking on BBC Radio 5 Live on Thursday, he said US tech firms “must adhere to British laws” if they are to operate in the UK. He said Silicon Valley bosses such as Elon Musk and Mark Zuckerberg must “adapt to the different territories they have access to”.

Kyle has also made clear that protection of children was a red line, saying last month that “none of our protections for children and vulnerable people are up for negotiation”.
