Major Tech Platforms Fail to Deliver on EU Fact-Checking Commitments, Risking DSA Compliance

Carlos Hernández-Echevarría / Jan 11, 2024
In 2022, major technology companies including Google, Meta, Microsoft, and TikTok signed the EU Code of Practice on Disinformation, a voluntary set of “self-regulatory standards to fight disinformation.” The Code is slated to become an enforceable Code of Conduct under the Digital Services Act (DSA), which came into effect last year and will be fully implemented in February. It includes a chapter on the empowerment of fact-checkers that stipulates (1) that platforms conclude agreements with independent fact-checking organizations to achieve complete coverage of the EU member states and official languages, (2) that they integrate or consistently use fact-checking in their services for the benefit of their users, and (3) that they provide fact-checkers with access to the data they need to maximize the quality and impact of their work.
To evaluate how well the technology platforms are delivering on these objectives, the association of European fact-checking organizations, EFCSN, has reviewed whether and how the major social media and search platforms are fulfilling the commitments they made 18 months ago. The results are not encouraging: with few exceptions, the platforms are failing to implement the measures they committed to, openly reneging on some of the promises they made, or worse – misrepresenting their policies in their reports to make them appear aligned with the Code.
Some of those failures could even be funny if the issue were not dead serious: YouTube reports partnerships with “EU based fact-checking organisations” in countries such as Myanmar or Indonesia. Others are genuine actions, but with extremely limited impact: TikTok reports having “fact-checking coverage” in multiple EU member states, but admits to having reviewed only 15 videos in some of them over six months. Microsoft’s Bing committed to showing fact-checks to users when they searched for a debunked claim, but didn't show a single one on the first page of results in 15 different countries over the same period.
The 2022 Code of Practice on Disinformation was the result of long, complicated negotiations that produced a compromise acceptable to the biggest tech companies in the world, the most active anti-disinformation and digital rights civil society organizations, and many other relevant actors alike. It was also stewarded by the European Commission. It had everything that previous self-regulation had lacked: detailed measures, concrete metrics, and a permanent task force to keep the collaboration going. Most importantly, it was conceived from the very beginning to become an official Code of Conduct on risk mitigation under the DSA.
Eighteen months later, with the DSA in full force for Very Large Online Platforms (VLOPs) and Search Engines (VLOSEs), the biggest players in the tech industry have not managed to implement even the measures they volunteered to apply. The EFCSN report relies on the data the companies themselves have published in their DSA and Code of Practice reports. Even though the last full reporting period (January-June 2023) fell just before the DSA's full deployment for the larger services, the evidence suggests that the ability of some of them to put “effective risk mitigation measures” against disinformation in place is far from a given.
As we enter 2024, a year of elections in the EU and beyond, many of those platforms are probably preparing to deploy new measures against disinformation and manipulation, and some of them are said to be “exploring” ways to empower their users with additional information and context when they encounter disinformation in their services. However, EU citizens cannot afford 18 more months of inaction and back-and-forth, nor should regulators give a pass to some of the biggest companies on the planet if they are unwilling to abide by EU laws protecting users from lies and manipulation.