
Will the Facebook Papers be the Catalyst for the Big Tech Reboot?

Rebekah Tweed / Nov 15, 2021

Rebekah Tweed is the creator of the Responsible Tech Job Board and the Program Director at All Tech is Human.

Are the Facebook Papers brought forward by whistleblower Frances Haugen-- evidence that the company’s leaders repeatedly and knowingly ignored the public good while enabling harm and violence in pursuit of profit-- enough to finally inspire the democratic action necessary to better align Big Tech with the public interest?

The Cambridge Analytica scandal didn’t do it. Previous whistleblower Sophie Zhang’s damning revelations about the way Facebook is used by governments to manipulate the public weren’t enough (for reasons explored in A tale of two Facebook leaks, an article in Columbia Journalism Review). Even the storming of the U.S. Capitol on January 6th, 2021 has failed, so far, to spur reform. Nothing has yet motivated our collective resolve to resist the threat to democracy posed by a tech industry bound only to its mandate to maximize shareholder value regardless of its effects on society, and beholden to the whims and dictates of unaccountable CEOs.

Against this backdrop, a trio of Stanford University professors-- Rob Reich, Mehran Sahami, and Jeremy M. Weinstein-- a philosopher, a technologist, and a policymaker, respectively-- build a case for citizen activism in System Error: Where Big Tech Went Wrong and How We Can Reboot, seeking to inspire public engagement to address the perilous state of our democracy. The authors make a strong case for various positions: that democracy is demonstrably preferable to technocracy; that democracy’s role is to act as a guardrail against the worst outcomes; and that the tech industry cannot currently self-regulate, despite the industry’s insistence to the contrary.

The authors muse that democracy is the preferred form of government for a few reasons. Considering forms of governance against their capacities for “identifying and mitigating the harms and suffering we want to avoid,” they argue, “democracies have generally excelled: avoiding mass starvation; preventing nuclear war; eliminating extreme poverty and suffering.”

But this is not necessarily the view amongst tech executives. Reich shares an anecdote of a discussion at a small dinner of unnamed Silicon Valley elites musing about a new state “to maximize science and tech progress powered by commercial models.” When Reich raises the question of governance, the response is telling. “Democracy? No. To optimize for science, we need a beneficent technocrat in charge. Democracy is too slow, and it holds science back.”

The authors contend that free expression, democracy, and individual dignity are competing values that cannot all be optimized for equally: “If a fervent commitment to free speech is threatening democracy by allowing misinformation and disinformation in our public square, it is also threatening individual dignity.” The Big Tech-enabled “marketplace of ideas” is hopelessly submerged in a flood of algorithmically amplified disinformation, where countering bad speech with good speech is equivalent to un-yelling “fire” in a crowded theater, and efforts to correct misinformation are as pointless as an earnestly appended retraction after the stampede for the exits has already begun.

System Error lays out policy prescriptions to back up its theories:

  • The authors advocate bringing technologists into the public sector as “tech teammates” through flexible mechanisms outside of formal civil service channels;
  • Reanimating the zombie Office of Technology Assessment (OTA), an office of the United States Congress founded in 1972 and defunded under Newt Gingrich in the mid-1990s, which could provide a channel for informing politicians and policymakers on technology-- a role now played mostly by Big Tech lobbyists; and
  • Enabling “adaptive regulation” through so-called “regulatory sandboxes” to address the chasmic lag-time between the “move fast and break things” (or “race-to-MVP and lock in network effects”) pace of emerging technological progress and the successful passing of regulations.

Regulatory sandboxes describe a process that involves a proof of concept, “forking” the existing body of law most relevant to the innovation at hand, provisionally granting the innovator permission to deploy the technology while devising regulations for the new system for a year, and reconvening to weigh the benefits and drawbacks. This type of system could provide some of the requisite agility that policy-making sorely lacks in a democracy that desperately needs to keep up with the existential threats against it-- threats enabled by the very technologies it has been ill-equipped to promptly govern.

System Error suggests that public attention and participation are required to jumpstart the policy changes needed to safeguard democracy. Just as Industrial Revolution-era labor laws were sparked by the Triangle Shirtwaist Factory fire, or healthcare industry guardrails were partially inspired by the Tuskegee Experiment, are the revelations of the Facebook Papers the catalyst necessary to finally regulate the tech industry?

Regulation is especially urgent because democracy is in so perilous a position, as the information ecosystem has already been so degraded by disinformation and polarization. The authors hint that things could get worse, thanks to synthetic media technologies such as OpenAI’s GPT-3, which may make it possible to flood the internet with a cacophony of machine-generated content.

But we don’t actually need GPT-3 (or similar tools that could soon be readily available) to supercharge and automate a deluge of disinformation. Facebook’s algorithms are already successfully amplifying and rendering viral the paltry efforts at disinformation disseminated by people taking advantage of the economics of social media, like the small cadre of Macedonian teenagers who ginned up false stories prior to the 2016 U.S. presidential election. Today’s volume of disinformation, algorithmically amplified, is already drowning democracy by flooding the public square with a targeted and artificially boosted firehose of bile. The number of bad actors can remain relatively low because the platforms amplify extremist messaging. If this is already happening without GPT-3-style tools, what happens when the automation is coming from both propagandists and platforms?

Importantly, the authors emphasize that these problems are not just about Facebook, and not just about Mark Zuckerberg, because “there are literally hundreds of would-be Zuckerbergs who are in the pipeline.”

Therefore, the time is now. If the Facebook Papers can be used to engage the public, perhaps we can transform the attention into collective action through our democratic institutions. The bright spots are ample: there is a bipartisan appetite for regulating Big Tech, though the political right and left are motivated by different values-- contrast Elizabeth Warren’s larger push for stakeholder capitalism in her Accountable Capitalism Act with Josh Hawley’s concern about censorship of conservative voices. There is a brief window open during President Joe Biden’s administration during which there are motivated personnel in place to make moves toward meaningful regulation. Key appointments include Alondra Nelson as deputy national science advisor (also, notably, “President Biden elevated the national science advisor to the Cabinet for the first time”); Lina Khan, a specialist in antitrust and competition law, as Chair of the Federal Trade Commission; and, just announced last week, Meredith Whittaker, a core organizer of the 2018 Google walkout, to the role of Senior Advisor on AI to the FTC; among others.

With reform-minded leaders in place, there are also substantial efforts underway to build a bench. There are new initiatives to encourage technologists to enter public service, from New America’s Public Interest Technology University Network to programs such as the Congressional Innovation Fellows, the Presidential Innovation Fellows, and the just-launched U.S. Digital Corps. These programs will create a path for people who want to work on these issues, and universities are training the talent. For instance, System Error’s authors at Stanford University created one of the most popular courses on campus, “Ethics, Public Policy, and Technological Change,” which will likely influence computer science curricula more broadly and, potentially, the overall education of future tech industry employees.

The Facebook Papers can be the catalyst for the public attention that System Error contends is required. The policy prescriptions for reining in Big Tech are persuasive; bipartisan appetite to legislate anything at all amidst the gridlock should not be squandered. The personnel are in place. The public has been inspired. The Big Tech Reboot must come now.
