Perspective

What’s the Equivalent of Grounding Planes in the Tech Sector?

Faun Rice / May 19, 2025

Faun Rice reviews Darryl Campbell’s Fatal Abstraction: Why the Managerial Class Loses Control of Software, published by W.W. Norton on April 8, 2025.

One evening in March 2018, 49-year-old Elaine Herzberg, a grandmother of seven and a pro wrestling fan, pushed her bike across a road in Tempe, Arizona. Earlier that day, Rafaela Vasquez, an Uber operator and former driver, had checked in for her shift. Uber was testing its new autonomous vehicle prototype. Vasquez’s job was to monitor Uber’s systems to ensure the sensors were working correctly while keeping an eye on the road.

What happened next is recounted by Darryl Campbell, author of Fatal Abstraction: Why the Managerial Class Loses Control of Software, who was working in operations at Uber at the time. As the autonomous prototype sped toward Herzberg, its radar struggled to classify her. She pushed her bike across the road on a trajectory similar to that of a car slowly changing lanes. The software expected only cars on this road, not pedestrians, because there were no sidewalks or crosswalks. Herzberg’s silhouette, a person pushing a bike, was also hard to classify. Only when the car’s sensors finally identified Herzberg as an object moving into the vehicle’s path did the emergency braking sequence begin.

When Campbell started in tech, he felt like he was on the cutting edge, helping build utopian applications. Then he joined Uber. Between 2016 and 2018, Campbell said when we spoke, he “began to realize that the utopian parts were only lip service,” and that some companies used technology “to extract wealth from people, and especially funnel it from vulnerable populations like Uber drivers or temporary labor into the pockets of billionaires, large investment companies, and so on.” Fatal Abstraction asks us to recognize that software rarely goes rogue—instead, it’s steered by management decisions made in isolation, without consequences for the decision-makers.

The book details a series of cost-cutting measures Uber relied on to roll out self-driving test cars before its competitors. Some of the issues Campbell identifies are embedded in the cars’ technology and choice of sensors; others stem from efforts to save on staffing. Campbell reports that in early testing, two human operators monitored every autonomous drive: one reviewed the car’s systems while the other kept their eyes on the road. In October 2017, Uber cut the two-operator policy, handing both tasks to a single person in a move that ignored industry best practice. Uber’s autonomous vehicles team had also decided that a hard automatic brake caused rider discomfort, so the car instead decelerated over the course of three seconds. And because early prototypes were too eager to brake, Campbell reports, Uber built in a “mandatory one-second delay every time the emergency brake was triggered, ‘while the system verified the nature of the detected hazard.’”

Both Vasquez and the car reacted too late: by the time the operator and the vehicle registered Herzberg and braked, the collision was unavoidable. Legal teams later debated whether Vasquez had been on her phone or monitoring the car’s systems; Campbell reports that her personal phone was far away, in the cubby on the passenger-side door. Years later, Vasquez was charged with endangerment, while the company was determined not to be at fault.

Campbell’s critique of Uber is not merely that it experimented with autonomous vehicles, but that Uber’s leadership set self-imposed deadlines and cut costs to deliver an unsafe product with inadequate guardrails. These choices had fatal consequences. And it is this pattern that Campbell outlines again and again throughout Fatal Abstraction: unilateral management decisions, made by a small circle of players focused on the bottom line, without having to price in the potential consequences for users or the public.

Campbell, a former economic historian, begins Fatal Abstraction with a short history of the MBA. He draws a direct line from MBA “managerialism”—the belief that business decisions are best made using financial abstractions—to the Boeing 737 MAX crashes, an infamous case of software gone wrong. In the rest of the book, Campbell draws on his experience in the technology sector to give other examples of what he calls “managerial software,” including Meta’s role in the Rohingya genocide in Myanmar and generative AI’s extraction of artists’ intellectual property.

Fatal Abstraction establishes that software failures result from a pattern of decision-making that often reduces choices to lines in a spreadsheet, with little risk borne by the decision-makers. When we spoke, Campbell said he liked to use organizational theorist Henry Mintzberg’s onion metaphor of management (from Mintzberg’s 2009 Managing). “Org structures aren’t nicely organized flow charts where information flows easily, but more like an onion,” he explained. “The closer you get to the center of the onion, your decision-makers, the less you can see what’s happening outside on the periphery. The periphery is really where you start to get into some of these externalities and safety questions, and it’s just impossible for someone who spends most of their time at the core to really see what’s going on out there.”

Problems caused by software scale rapidly. After reading through Campbell’s examples, a reader starts to see the same pattern repeating everywhere: in software’s easily hidden climate impacts, or in the housing market, where Airbnb inadvertently reduced rental stock in cities and drove up rent prices around the world.

As a tech industry insider, Campbell ends with a call to his peers: white hat hackers, whistleblowers, and unionizers. He wants internal pressure to fix the problem, fostering better social responsibility within companies. When we spoke, he noted that tech workers were starting to see that their own jobs were at risk from managerial software in the form of generative AI coding assistants. “People are willing to spend 50% less to get 40% less quality: that is a conscious decision that they’ve made in terms of things like customer service chatbots,” he explained. “Now that’s really starting to hit home, so I’m hopeful that that also spurs some action.”

Campbell calls on industry insiders to step forward because he feels that regulation has failed to check the impacts of managerial software. He argues that many policymakers lack the expertise to review technology, and he gives the example of an audit of the Federal Aviation Administration’s (FAA’s) capacity to catch technical deficiencies in the Boeing 737 MAX software. He also raises issues such as lobbying, and rightly notes that for many of his examples, penalties driven by the legal system have been slow to materialize, such as in the Rohingya hate speech lawsuit against Meta. Still other efforts to price software’s negative externalities into its costs have often fallen on the consumer, as when cities add an extra charge for downtown ride-hailing trips to offset curbside idling.

Fatal Abstraction left me with one central policy question: have we ever seen “mandatory grounding” in the pure software industry, and will we ever? Campbell’s book compares software with and without physical interfaces: the Uber AV crash and the Boeing 737 MAX failures, versus generative AI and Meta in Myanmar. At least with Uber’s vehicles and Boeing’s planes, software was nested in physical vehicles with a stronger history of safety regulation. The book reminded me of something Replika founder Eugenia Kuyda said in a recent interview on The Slow Newscast. She was asked if children should be allowed to use AI companion applications—a question that came just after news of a teen’s suicide linked to Character.AI. Kuyda compared her business to the tobacco industry and said regulation would need to come from the state. “Kids are not allowed to buy tobacco today, and they probably will not be allowed on social media very soon,” she said. “But, you know, for now, we’re in this era where it’s like Pan Am and everyone’s smoking in the cabin.”

Without clear opportunities for intervention—the software equivalent of grounding a plane—Campbell’s examples show a trend of managerial software continuing to cause indirect, accumulating harm.

Authors

Faun Rice
Faun Rice is a social researcher and writer based in Vancouver, BC. Trained in anthropology and sociology at the University of Alberta, Faun began her career as a language revitalization researcher and then a museum researcher at the Smithsonian Institution. Today, she is the Manager of Research & E...
