What Does CrowdTangle's Demise Signal for Data Access Under the DSA?

Megan A. Brown, Josephine Lukito, Kai-Cheng Yang / Mar 27, 2024

Meta's CrowdTangle will be deprecated in August, according to the company.

Last week, Meta announced that CrowdTangle, a tool commonly used by researchers and journalists to shed light on what happens on Facebook and Instagram, will sunset in August of this year. In 2024, 64 countries, nearly half the world, are holding elections, and key transparency tools for social media platforms are increasingly inaccessible. Against the backdrop of profound global humanitarian crises, the violence of war, and broad threats to civil liberties across the globe, the consequences of this election year are dire. Digital infrastructure is a core piece of the puzzle: it is a source of information, storytelling, organizing, and newsworthy commentary, but it also enables the proliferation of hate speech and mis/disinformation, including election lies that can contribute to offline violence and strife.

For years, watchdog groups, election watchers, activists, researchers, and journalists have used CrowdTangle and other social media transparency tools to track online hate speech, report on violence against minoritized groups, and study the spread of extremist content online, among a range of other essential public interest topics. While efforts such as the Meta 2020 US election research project, a set of papers produced through industry-academic partnerships in which academic researchers collaborated with researchers within Meta, have produced compelling research, it is equally important to support independent research and hold platforms to account. Independent research is especially critical because projects like the Meta 2020 project are unlikely to be repeated, whether within Meta or at other platforms, and other internal research conducted by the platforms may never be made public. The sunset of CrowdTangle is a devastating loss for the transparency efforts that make essential independent technology research possible. It is another blow to researcher data access, following Twitter (now X) and Reddit ending free data access last year.

Despite the loss of these platform transparency tools, Europe’s Digital Services Act (DSA) opens up a promising avenue for access to platform data. Under Article 40 of the DSA, Very Large Online Platforms (VLOPs) are required to provide access to real-time public data to researchers studying a variety of “systemic risks” to the European Union, including risks to public health and security, civic discourse, free expression, electoral integrity, and more. Importantly, the law doesn’t specify that researchers must be based in the EU to be eligible for data access, but it also doesn’t give guidance on how platforms should interpret the relevance of research questions within the scope of the law. It does not specify exactly how platforms should provide access to this data either. All of these unknowns open the door for varying interpretations of what data platforms owe to whom and how that data is made available.

The promise of robust access to public data for researchers under the DSA rests on its implementation. In this year of global elections, the independent research community is relying on good faith efforts by platforms to design programs that facilitate broad access to public data for public-interest purposes, process researcher applications in a timely manner, and ultimately grant access to data. Researchers have already noted where some transparency programs, such as TikTok's Researcher API, are missing the mark. Many of the other public data access programs are available only to researchers at academic institutions, leaving out journalists and civil society advocates who are doing the important work of holding platforms to account. Meta's Content Library, the heir apparent to CrowdTangle, is also difficult for non-academic researchers to access.

Since the DSA doesn’t mandate how platforms must provide this data—only that they must do it—at the moment, the jury is still out on whether the research community is being positively served by tools like the Content Library and others. That’s why we launched the Data Access Audit.

Together with the Coalition for Independent Tech Research and George Washington University’s Institute for Data, Democracy & Politics, we launched a project last week to audit the state of researcher access to data under the DSA. We’re aiming to provide an objective assessment to policymakers, platforms, and the public about how well these new tools are enabling research, journalism, and other public interest work, and the areas where they may not be. You can participate in our Data Access Audit by taking this survey.

We’ll be looking carefully at the Content Library and other transparency programs to determine where they’re falling short and who is being left behind. If we’re successful, with the help of the global research community, we’ll be able to advocate with evidence for clearer guidance from policymakers about what platforms must do to comply with the data access provisions of the law. The new information provided by this survey will also help us negotiate with platforms directly to design features and improvements to their data access products.

Despite the DSA's promise of greater transparency and access to data, we're watching CrowdTangle and other platform APIs disappear before our eyes, potentially without viable alternatives in place. It is dangerous and counterproductive to box out the hundreds of researchers, journalists, and public advocates who depend on platform transparency tools every day to understand and report on our media and social environment during one of the most consequential election years; ultimately, the public will lose out if this happens. The only way to know whether platforms are boxing us out or inviting us in is to organize, and we hope that you'll join us.


Megan A. Brown
Megan A. Brown is a PhD student at the School of Information at the University of Michigan. Her research is centered on the online information ecosystem. She studies political bias in algorithmic systems, the effect of platform governance and moderation policies on the spread of political content, a...
Josephine Lukito
Dr. Josephine ("Jo") Lukito is an Assistant Professor at the University of Texas at Austin’s School of Journalism and Media. She is also the Director of the Media & Democracy Data Cooperative and a Senior Faculty Research Affiliate for the Center for Media Engagement. Jo uses mixed methods and compu...
Kai-Cheng Yang
Kai-Cheng Yang is a postdoctoral researcher at Northeastern University. He obtained his Ph.D. in Informatics from Indiana University Bloomington. He is interested in computational social science with a focus on identifying bad actors, such as malicious social bots, and studying the diffusion of misi...