
Facebook and Instagram probed over disinformation handling

The European Commission says the platforms’ tools to protect the EU elections are not sufficient.

Meta’s Facebook and Instagram are subject to an investigation over how the platforms tackle disinformation under the Digital Services Act (DSA), amid fears they are vulnerable to Russian networks, the European Commission said today (30 April).

The suspected infringements cover Meta’s practices relating to deceptive advertising and political content on its services, the Commission said.

According to the EU executive, the company’s advertising network is vulnerable to misinformation, and potentially a target for Russian networks.

The platforms also lack an effective third-party real-time election-monitoring tool, since Meta deprecated its public insights tool CrowdTangle without providing a replacement. This makes it difficult for researchers and journalists to track the company’s efforts to take down illegal content.

“Given the reach of Meta’s platforms in the EU – accounting for over 250 million monthly active users – and in the wake of the European elections, and a series of other elections to take place in various Member States, such deprecation could result in damage to civic discourse and electoral processes in relation to the mis- and disinformation tracking capabilities, identification of voter interference and suppression, and the overall real-time transparency provided to fact-checkers, journalists and other relevant electoral stakeholders,” the Commission said.

In addition, the Commission has questions about the lack of visibility of political content and mechanisms to flag illegal content.

Operations centre

An EU official said that there is no specific timeline for when Meta needs to make the changes. The Commission expects Meta to cooperate, having already had some constructive conversations.

“We are confident they will act quickly. It’s in nobody’s interest that the website is exploited by Russian actors, and it’s fundamentally wrong that they make money on this,” the official said.

Under the DSA, companies designated as a Very Large Online Platform (VLOP) – online platforms with more than 45 million average monthly users in the EU – must abide by strict rules, such as transparency requirements and the protection of minors online.

Meta said earlier this year that it was setting up its own operations centre for the elections “to identify potential threats and put mitigations in place in real time”. In a separate statement, Facebook’s parent company said that it planned to start labelling AI-generated content in May 2024.

In response to today’s investigation, a company spokesperson told Euronews that a “well-established process for identifying and mitigating risks on our platforms” is in place.

“We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”

Stress test

Last week (24 April), the Commission invited Meta, TikTok, X and other online platforms to stress-test its DSA election guidelines, which are aimed at helping very large online platforms and search engines mitigate risks to the integrity of elections and of their services.

Today’s probe follows the launch last week of another Commission investigation into TikTok, as well as probes opened earlier this year into X – also related to illegal content – and into AliExpress over compliance with the DSA.

Meta itself filed a legal complaint at the General Court in Luxembourg in February, challenging the supervisory fee imposed on it by the Commission under the DSA.
