The Platform Accountability and Transparency Act, Take Two

John Perrino / Dec 22, 2022

John Perrino is a policy analyst at the Stanford Internet Observatory.

The U.S. Senate Floor. Wikimedia

The Platform Accountability and Transparency Act (PATA), introduced on Wednesday, would give researchers at universities and nonprofit organizations in the U.S. access to study data from the largest social media companies and provide public transparency on the most widely shared posts, advertising, content moderation practices and recommendation algorithms.

The bill, co-sponsored by Sens. Chris Coons (D-DE), Rob Portman (R-OH), Amy Klobuchar (D-MN) and Bill Cassidy (R-LA), received endorsements from a range of voices after it was released as a discussion draft last December — including former President Barack Obama and the Washington Post editorial board. Now, the newly introduced legislation builds on the draft by expanding the class of researchers who can gain access to social media data, adding public transparency requirements, and adding important data security provisions.

The social media landscape has changed in important ways in the months since the draft bill was released, including heightened safety and security concerns about TikTok, and the chaos surrounding Elon Musk’s acquisition of Twitter.

“Does TikTok promote or suppress posts in the Chinese Communist Party’s interest? Do platforms recommend harmful or addictive content to vulnerable users? The bipartisan Platform Accountability and Transparency Act would help answer questions like these,” Sen. Coons said in the announcement.

Among the most substantial changes from the draft bill are provisions to include nonprofit organizations in a researcher consortium program alongside university researchers, and more specific requirements for publicly accessible databases and metrics on viral and other high-engagement public content, including the actual text and any accompanying links or media. There are also transparency reporting requirements on content moderation practices and on how content is recommended to users, as well as protections for researchers who independently access public data and adhere to privacy and security guidelines.

The provisions are extended to cover augmented and virtual reality platforms in addition to social media websites and applications. Just as in the draft, the National Science Foundation and Federal Trade Commission would have joint responsibilities to facilitate and develop best practices for the research sharing program and public transparency reporting.

Critics of the draft bill had expressed concerns about data security protections and the potential misuse of information for commercial or government purposes. In the absence of a national privacy law to provide baseline protections, several data protection measures were added. The legislation explicitly excludes researcher access to direct or private messages, biometric data, and location data. Datasets shared with researchers must be encrypted, and user information included in the data must be anonymized. Information will be collected on who accesses the shared data and what searches they run, holding researchers accountable for using the data appropriately for research purposes. The bill also notably attempts to close potential legal loopholes for government or law enforcement access to the data shared with researchers.

The legislation’s introduction comes as Twitter is in the crosshairs of legislators and regulators around the world who are raising safety and free speech concerns about the platform under Elon Musk’s ownership. Researchers have long relied on Twitter, which is perhaps the most open major platform for accessing and studying online conversations. TikTok, meanwhile, is under fire over national security concerns because of its China-based parent company, ByteDance.

“We are learning the extent Big Tech companies will go when no one can shine a light on what they are doing. Congress needs to have the information to hold these companies accountable. Our bill increases transparency into data collection by social media companies,” Sen. Cassidy said.

The legislation attempts to strike a complicated balance between meaningful transparency and privacy, and could put the U.S. back in the driver’s seat as the leader in developing rules for social media companies. The forthcoming EU Digital Services Act will require the largest social media platforms to submit to audits, provide transparency reporting, and grant researcher access, but the European regulations do not contain the same provisions for public access to high-engagement, or viral, content, nor do they explicitly give nonprofit researchers the same ability to access social media data that they give university researchers.

The legislation notably draws upon prior research and collaborations in its understanding of what researchers need to conduct studies on the effects of social media, including an understanding of what data is available to them — an important aspect highlighted in a recent Center for Democracy and Technology workshop report on the subject. The legislation also incorporates a New York University proposal for a privacy-preserving protocol for public access to advertising and high-reach social media content.

While there is no House companion legislation, the Digital Services Oversight and Safety Act (DSOSA), introduced in February by Rep. Lori Trahan (D-MA), includes similar transparency provisions for social media companies. If passed, PATA could help address a range of concerns about online content moderation — spanning GOP concerns that too much content is being taken down, Democratic concerns that social media companies are not addressing misleading information or hate speech, and what appear to be bipartisan concerns about children’s safety and teen mental health.

PATA represents a new American approach to addressing a wide range of concerns about the impact social media companies have on society. It could help policymakers and researchers take the first step toward better understanding potential harms, which could in turn lead to more tailored and effective legislative and regulatory remedies. How the industry will react to the terms of this legislation remains to be seen, but tech executives gave the broad contours of PATA their tacit approval at a Senate hearing earlier this year.

Without shining a light on the social media companies that collect troves of data on us, their users, we cannot hold them accountable or take meaningful action. Whether this legislation can advance will be a key question for those concerned about tech policy to follow in the new year.


John Perrino is a policy analyst at the Stanford Internet Observatory where he translates research and builds policy engagement around online trust, safety and security issues.