Will Europe’s Digital Services Act Encourage Global Platform Election Integrity Standards?

Justin Hendrix, Ben Lennett / Jun 5, 2024

Brussels, Belgium - November 3, 2022; European Flags in front of the European Commission Headquarters building. Shutterstock

With the election season underway in dozens of democracies around the world, experts remain concerned about whether social media platforms have sufficient election integrity policies and resources in place to address challenges such as disinformation, election delegitimization, synthetic and manipulated media, threats of violence against election officials, and more. In the US, civil society organizations continue to raise alarms in the runup to the November elections, but the major platforms appear satisfied they are doing enough. Meanwhile, in the European Union, which holds parliamentary elections later this week, regulators are deploying new powers under the Digital Services Act (DSA) to launch investigations into platforms and to establish standards for election integrity.

In the long run, the so-called “Brussels effect” may be the best hope to force platforms to align with what appears to be an emerging consensus on necessary measures to support election integrity on social media. That consensus is discernible in documents released by advocates, experts, and even EU regulators in recent weeks, including a set of recommendations raised by a coalition of US groups, a document produced by Meta’s Oversight Board that contains a similar set of recommendations, and most notably the new set of guidelines on election integrity issued by the European Commission. Whether, and how, any of the baselines established by the EU will result in platform policy or product changes that answer the concerns of watchdogs in the US and beyond is a question worth considering.

US experts and advocates get the cold shoulder from platforms

Despite what might appear to be diminishing returns, advocates for platform accountability in the US continue to demand more attention to election integrity issues. For instance, in April, a coalition of roughly 200 civil society organizations, researchers, and advocacy groups sent letters to 12 platform CEOs calling on them to expand and improve their election integrity efforts. The letter included six specific requests, including that platforms commit to “staffing up critical platform-integrity teams,” swift enforcement of “rules against election lies and hate in political advertising,” “disclosure of AI-generated political content, including ads,” consistent moderation standards for “influencer, public figure and political candidate accounts,” and greater transparency to “enable civil-society oversight of enforcement practices.”

The letters, however, were largely met with indifference from industry. Per a summary report prepared by the advocacy group Free Press, the eight platforms that chose to answer offered few additional commitments, indicating that the industry on the whole sees little need to substantively engage on these issues. As the report details:

Eight of the 12 companies responded on or around the April 22, 2024 deadline. These included Google and YouTube (through a Google representative), Meta and Instagram (through a Meta representative), Pinterest (not linked here because Pinterest marked the letter privileged and confidential), Reddit, Snap and TikTok. Six of the eight responses, excepting the ones from Google and Snap, hovered at barely two pages of text, hardly enough to substantively answer the questions presented.

X confirmed receipt of the letter but has not responded substantively. Discord, Rumble and Twitch failed to respond altogether. This shows utter disrespect to the more than 200 civil-society organizations, researchers and journalists that signed the letter, as well as a shocking disregard for the precarious state of global democracy this year.

Moreover, the report indicates that none of the companies that responded said directly whether they would adopt any of the coalition's recommendations, that none promised to do anything different regarding content intended to advance the false claim that the 2020 US election was stolen, and that none committed to increasing staffing for election integrity efforts to enable more effective content moderation.

The requests from the coalition were not the first made by civil society organizations in the US, nor will they be the last this cycle. But the meager response from the platforms indicates that the companies feel little responsibility to engage substantively with these groups or their concerns. Indeed, Free Press concluded that “[t]aken together, the responses from the platforms are wholly insufficient and demonstrate a lack of seriousness across the industry about the precarious state of elections around the globe.” Without a clear and enforceable legal mandate like the one in Europe, the companies have substantially less incentive to listen to the concerns of civil society.

Meta’s Oversight Board weighs in

Shortly after Free Press issued its summary report, Meta’s quasi-independent Oversight Board entered the fray with a white paper that summarized key lessons for Meta and other social media platforms to consider in order to “better protect political speech and counter online challenges to the safe and reliable conduct of elections, under the guidance of international human rights standards.” Much of the language is congruent with what the coalition of US groups recommended. The Oversight Board encouraged platforms, in particular, to focus content moderation efforts on “political speech that incites violence,” arguing “content that outweighs the public interest, and which fundamentally undermines the election process, must be expedited for human review and removed when necessary.”

The Oversight Board urged platforms to establish standards “for AI-generated content or ‘deepfakes’ and other types of manipulated content such as ‘cheap fakes,’” and to take steps to “improve their design and policy choices to ensure that disinformation narratives are not amplified.” Just as important, it stressed the need for a “set of basic global platform standards” and consistent enforcement by platforms, which should dedicate “sufficient resources to moderating content before, during and after elections, doing so on a global scale irrespective of whether they have political or economic interests in the affected country.” The paper’s key lessons and recommendations, like most Oversight Board recommendations, are non-binding on Meta.

Europe has a new regulatory stick and appears ready to use it

While US civil society groups and advisory bodies like the Oversight Board have little recourse beyond trying to influence platforms in the court of public opinion, the EU has regulatory tools to compel platforms to act under the DSA, which went into full effect in February. These include investigative authorities that allow regulators to demand information and seek compliance with the law. The European Commission’s announcement in April that it would open formal investigative proceedings against Meta, the parent company of Facebook, Instagram, and WhatsApp, indicates the Commission is keen to use these new powers to the fullest extent possible.

And, the Commission is zeroing in on election integrity concerns more generally. Just days before the announcement of the Meta investigation, the Commission released “Guidelines for providers of Very Large Online Platforms and Very Large Online Search Engines on the mitigation of systemic risks for electoral processes.” These guidelines include provisions requiring platforms to demonstrate that they are “reinforcing internal processes” related to elections, ensuring there are sufficient content moderation resources, setting up “dedicated, clearly identifiable” teams focused on elections, implementing “mitigation measures linked to generative AI,” and showing a willingness to “collaborate on, implement, invest and engage in” various media literacy initiatives.

The investigation into Meta focuses on several areas of concern that are also addressed in the guidelines, including “deceptive advertisements and disinformation,” “visibility of political content,” and “the non-availability of an effective third-party real-time civic discourse and election-monitoring tool ahead of the upcoming elections to the European Parliament and other elections in various Member States.” It also comes after Meta announced it would sunset CrowdTangle, a tool used by researchers to track various phenomena across Facebook and Instagram.

The extent to which platforms can or will be held to account under these guidelines is still unclear, but the investigation can arguably be read as a laboratory for the guidelines, and perhaps for DSA enforcement more broadly:

  • For instance, the Commission’s concerns about Meta’s “political content approach,” which the company has said diminishes political content in the recommender systems on Facebook and Instagram, relate to the guidelines’ emphasis on the role of recommender systems in shaping the information landscape and public opinion. The guidelines advise platforms and search engines to ensure that recommender systems “are designed and adjusted in a way that gives users meaningful choices and control over their feeds, with due regard to media diversity and pluralism.” Whether an opt-in feature will satisfy this concern remains to be seen.
  • The guidelines also emphasize that platforms should “facilitate access to official information concerning the electoral process, including information on how and where to vote, based on official information from the electoral authorities of the Member States concerned.” In announcing its investigation of Meta, the Commission indicated that it suspects Meta policies and practices may not address the dissemination of deceptive advertisements and disinformation that may hinder users’ ability to receive accurate and reliable information about the electoral process. “The proliferation of such content may present a risk to civic discourse, electoral processes and fundamental rights, as well as consumer protection,” says the Commission.
  • Importantly, the Commission’s announcement of its investigation highlights the necessity of independent scrutiny of the platforms, including data access for researchers concerned with the integrity of electoral processes. The guidelines state that "third party scrutiny and research into mitigation measures are important to help providers of VLOPs and VLOSEs ensure that the measures they put in place are effective and respect fundamental rights, as well as democratic principles." Meta’s deprecation of CrowdTangle, without an adequate replacement, raises concerns about the company’s commitment to transparency and its ability to meet the accountability requirements of the DSA.

Platforms operating in Europe that are not already under investigation over election-related issues would do well to consider the guidelines, which provide a comprehensive framework for tech firms to assess and mitigate systemic risks. But some DSA watchers urge caution when it comes to interpreting the guidelines, their relationship to the Meta investigation, and their potential effect on platform products and practices. Martin Husovec, an Associate Professor of Law at the London School of Economics and Political Science (LSE), writes that it is “not yet clear” if the Meta investigation is related to the release of the guidelines, and that even if it is, “the election guidance in itself will not be sufficient to create some minimum expectations of risk mitigation.”

Reasons for (tempered) optimism

Indeed, much remains unclear about how the DSA will work in practice. The EU’s investigation into Meta made headlines, as Husovec points out, but precisely what it will mean is as yet unknown. Eventually, though, the “Brussels effect” could still mean that some of the concerns raised by US advocates and by advisory bodies like the Oversight Board will be addressed by platforms that seek to comply with the DSA and to abide by its election guidelines. One way the DSA might do so is simply by demonstrating what certain reforms, when required by law, can accomplish.

For instance, the aspects of the Commission’s proceedings against Meta concerning the deprecation of CrowdTangle may be one particular reason for optimism, says Rebekah Tromble, an associate professor in the School of Media and Public Affairs at The George Washington University and Director of the Institute for Data, Democracy, and Politics (IDDP). “It will be extremely difficult for Meta to credibly claim that it could and should restart CrowdTangle in Europe but leave the rest of the world in the lurch—especially in a year with so many important elections worldwide,” said Tromble. “They’d certainly hear from a lot of unhappy policy makers if they attempted that.”

Last week, CrowdTangle did announce it had prepared “live displays” of material related to the European parliamentary elections, possibly in response to the Commission’s inquiry. (A coalition of independent research and civil society groups says the displays are little more than a distraction aimed at pacifying the EU Commission in the wake of its inquiry.) But other changes that platforms make in response to the DSA could be instituted beyond Europe. For instance, Husovec believes the EU Commission’s action “will force platforms to take more steps to limit deceptive ads and improve authentication of who buys them.”

Of course, there are many factors that serve to temper such optimism. While the “Brussels effect” is generally understood by those in favor of regulation as a positive phenomenon, some experts point to the possibility of a “dark Brussels effect,” wherein compliance requirements in Europe shift platform resources away from other regions. And there is the possibility that even if platforms ultimately meet all of the requirements of the DSA, it won’t make much difference if political leaders continue to aggressively promote falsehoods, or if they decide to challenge the legitimacy of democratic elections or call for violence. Only time will tell. But for now, the DSA appears to be the best bet to demand more from platforms on election integrity.

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
Ben Lennett
Ben Lennett is managing editor for Tech Policy Press and a writer and researcher focused on understanding the impact of social media and digital platforms on democracy. He has worked in various research and advocacy roles for the past decade, including as the policy director for the Open Technology ...