Instagram and Facebook found in breach of EU law over flagging of illegal content


Instagram and Facebook have breached EU law by failing to provide users with simple ways to complain about or flag illegal content, including child sexual abuse material and terrorist content, the European Commission has said.

In a preliminary finding on Friday, the EU’s executive body said Meta, the $1.8tn California company that runs the social media services, had introduced unnecessary steps in processes for users to submit reports.

It said both platforms appeared to use deceptive design – known as “dark patterns” – in the reporting mechanism in a way that could be “confusing and dissuading” to users.

The commission found this amounted to a breach of the company’s obligations under the EU-wide Digital Services Act (DSA), and meant that “Meta’s mechanisms to flag and remove illegal content may be ineffective”. Meta denies it has breached the act.

“When it comes to Meta, neither Facebook nor Instagram appear to provide a user-friendly and easily accessible ‘notice and action’ mechanism for users to flag illegal content such as child sexual abuse material and terrorist content,” the commission said.

A senior EU official said the case was not just about illegal content but also about freedom of speech and “moderation that has gone too far”. In the past, Facebook has been accused of “shadow banning” users on issues such as Palestine, meaning their content is demoted by the algorithm.

The current mechanisms for complaints were “too difficult for users to go through to the end”, resulting not just in ineffectiveness but a disincentive for users to get in touch, the official said.

Campaigners have continued to allege safety shortcomings in some of Meta’s products. Last month a Meta whistleblower, Arturo Béjar, published research that he said showed the majority of new safety tools rolled out on Instagram were ineffective, leaving children under 13 unsafe on the platform.

Meta rejected the report’s findings and said parents had robust tools at their fingertips. The company introduced mandatory teen accounts on Instagram in September 2024, and this month it said it would adopt a version of the PG-13 cinema rating system to give parents stronger controls over their teenagers’ use of the social media platform.

The commission also said Meta made things difficult for users whose content had been blocked or whose accounts had been suspended. It found the decision appeal mechanism did not appear to allow users to provide explanations or evidence to substantiate their appeals, limiting its effectiveness.

The commission said simplification of the feedback system would also help the platforms eliminate fake news such as the deepfake video in Ireland claiming the leading presidential election candidate, Catherine Connolly, was pulling out of Friday’s election.

The investigation, which is ongoing, was carried out in cooperation with Coimisiún na Meán, the Irish digital services coordinator responsible for regulating the platforms, whose EU headquarters are in Dublin.

The commission also made a preliminary finding that TikTok and Meta were in breach of their obligation to grant researchers adequate access to public data that could be used to check on how far minors are exposed to illegal or harmful content. It said researchers were often left with partial or unreliable data.


“Allowing researchers access to platforms’ data is an essential transparency obligation under the DSA, as it provides public scrutiny into the potential impact of platforms on our physical and mental health,” the commission said.

The preliminary findings allow the platforms time to comply with the commission’s demands. If they do not, they face a fine of up to 6% of total worldwide annual turnover, with periodic penalty payments to compel compliance.

Henna Virkkunen, the commission’s executive vice-president for tech sovereignty, security and democracy, said: “Our democracies depend on trust. That means platforms must empower users, respect their rights and open their systems to scrutiny.

“The DSA makes this a duty, not a choice. With today’s actions, we have now issued preliminary findings on researchers’ access to data to four platforms. We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society.”

A Meta spokesperson said: “We disagree with any suggestion that we have breached the DSA, and we continue to negotiate with the European Commission on these matters. In the European Union, we have introduced changes to our content reporting options, appeals process, and data access tools since the DSA came into force and are confident that these solutions match what is required under the law in the EU.”

TikTok has been approached for comment.
