Is It Ethical To Work At Facebook?

Justin Hendrix / Nov 16, 2021

The trove of internal research findings and other documents disclosed by Facebook whistleblower Frances Haugen gave further definition to a series of harms caused by the platform that many in civil society and academia have described for years. The documents bring a persistent pattern into sharp relief-- again and again, Facebook’s (now Meta’s) leadership prioritizes commercial interests over addressing harms to people.

These decisions appear to weigh heavily on employees of the company, whose posts on internal message boards reveal anguish. In her article on the Facebook Papers, titled History Will Not Judge Us Kindly, The Atlantic’s Executive Editor Adrienne LaFrance summarized the situation:

Again and again, the Facebook Papers show staffers sounding alarms about the dangers posed by the platform—how Facebook amplifies extremism and misinformation, how it incites violence, how it encourages radicalization and political polarization. Again and again, staffers reckon with the ways in which Facebook’s decisions stoke these harms, and they plead with leadership to do more…. And again and again, staffers say, Facebook’s leaders ignore them.

The disconnect seems especially hard for employees who joined the company to help it deliver on its stated mission to “give people the power to build community and bring the world closer together.” I was struck, in particular, by one quote from an employee on Facebook's internal message board, Workplace, after the January 6 insurrection at the U.S. Capitol that was reported in The Washington Post. “I’m struggling to match my values with my employment here,” the employee said. “I came here hoping to affect change and improve society, but all I’ve seen is atrophy and abdication of responsibility."

Indeed, such abdication has contributed to some pretty grisly outcomes-- from teens experiencing suicidal thoughts (and presumably some unknown percentage acting upon them) to the facilitation of human trafficking and ethnic genocide. Some former employees have acknowledged the rising body count. When Sophie Zhang-- another whistleblower-- left the company because it failed to take action on use of the platform for political manipulation, she confessed in a leaving post, “I know that I have blood on my hands now.”

Given the evidence, is it possible to go on drawing a paycheck from the company in good conscience? How should employees reckon with the ethical dilemma of earning a living at a company that prizes profits over the well-being of the billions of people who use its services, or who, even if they don’t have an account on Facebook or Instagram or WhatsApp, are nevertheless impacted by the externalities these products produce?

Over the past couple of weeks, I’ve put that question to a variety of individuals who each bring different perspectives to it. What emerged is not an easy yes or no answer, but a set of considerations that may be useful to someone trying to answer the question for themself.

One important caveat: nearly everyone I talked with suggested the answer to this question may be different for different people, who must each bring their own life circumstances and personal considerations to it. In recent conversations, both Dia Kayyali, a human rights and tech activist and director of advocacy at Mnemonic, and Ifeoma Ozoma, founder of Earthseed and the creator of the Tech Worker Handbook, reminded me of the axiom “there is no ethical consumption under capitalism.” Indeed, more than one of the individuals I spoke with acknowledged that most jobs eventually present ethical challenges to varying degrees. But the situation at Facebook is particularly acute given the scale of the company’s impact, and is therefore worthy of consideration.

  1. You can’t trust the company’s leadership.

The ethical problems at Facebook are uniquely tied to the way the company is led. Don Heider, the Executive Director of the Markkula Center for Applied Ethics at Santa Clara University, told me that the way the company is governed is simply unhealthy. “What's frustrated people, especially about Facebook, is that because it has a CEO who is also the chair of the board, he ultimately is responsible to himself and no one else.”

For a company to be well governed-- especially one that operates at the scale and degree of societal impact of Facebook-- it requires the board to hold the CEO to account and expect the company to make ethical decisions. A board must comprise individuals “that really care about the company and also care about humanity-- who will make decisions for profit, but also be concerned with good, be concerned with human beings,” said Heider. “And so if you have that, then you can have some kind of balance in there.”

Zuckerberg is an acute part of the problem, said Justin Sherman, cofounder of Ethical Tech at Duke University. But Sherman also points to “other actors at Facebook who are also very much behind Zuckerberg and perpetuating a lot of harm.” Sherman includes in that group people such as the company’s Chief Operating Officer, Sheryl Sandberg, and even lower ranking executives such as Policy Communications Director Andy Stone, who Sherman points out “likes to mock and bully researchers and other people on Twitter.”

Hany Farid, a professor at the University of California, Berkeley and Associate Dean of its School of Information, hit a similar note. “When day in and day out, month in and month out, year in and year out, your CEO and your CTO and your CSO and your full C-suite continues to behave in exactly the same way-- you can keep telling yourself the story that, ‘hey, look, I can affect change from the inside.’ But the reality is that doesn't seem to be the case.”

Don Heider senses a particular problem at play in Facebook’s leadership. “When I see chronically bad decisions, what I see is decisions that are made on fear rather than confidence,” he said. “Facebook to me always feels like a company run out of fear-- the fear that at any given moment, it could all blow up. And here's the sad truth: it can, especially if you're not paying attention to how you're harming people.”

  2. The company’s business model is the fundamental problem.

Hany Farid suspects even a much more competent and well governed leadership may still be left with a business model that is fundamentally built on exploitation.

“I think at its core, the problem is really the business model of Facebook-- and social media in general-- just stinks,” Farid said. The company “is in the attention grabbing, ad delivery, privacy invading, outrage fueling business, and the way you attract people to your platforms is to get them angry, to get conspiratorial, salacious, hateful, outrageous content. And that's what drives business. That is the model that we have learned from social media works.”

Jared Harris, Associate Professor of Business Administration at the Darden School of Business at the University of Virginia and an expert on business ethics and corporate governance, allows that not all the facts are in on social media and its effects on society. But, Harris said that “if the fact pattern parallels what we saw play out in Big Tobacco a decade or two ago, then the only thing that's really different is the harms themselves.”

So, rather than lung cancer or emphysema, think of psychological or social harms. The Facebook whistleblower leaks fit that pattern-- we now know the company conducted internal research on such harms, but kept the results hidden from the public. “And so from an ethics perspective," said Harris, "all of the things that are problematic about suppressing the way in which cigarettes were harmful to smokers could potentially apply here. We can think of other examples as well-- there's a lot of stuff coming out recently about how much oil companies knew about climate change and again, suppressed the science internally. If indeed the harms are verifiable and the internal evidence was suppressed, we've seen similar examples play out before.”

Harris sees a disconnect between how the company makes money and the overall value proposition to its users. “What seems to have happened is Facebook seems to have gotten away from thinking carefully about the end user experience. Not just the way in which the end user experience could maximize advertising dollars, but the entirety of the end user experience-- what are the pros and cons, are there limits to how much we want the technology to act and behave. I think that's the headline here,” he said. “The conclusion to me is stakeholders who matter need a voice-- and it looks to me like, you know, 11-year-old girls who use Instagram don't have much of a voice.”

  3. You can make an important difference, but the sum of the good will not redeem the whole.

Even if you acknowledge the problems in the company’s leadership and with its core business model, it may still be possible to do good work that impacts many people’s lives, especially given the magnitude of Facebook’s user base. While encouraging “a very realistic assessment” of the ethical challenges in working at or with Facebook, Dia Kayyali told me, “I have no doubt that there are people inside of Facebook who have probably saved lives,” and that there “are probably engineers who pushed for things that made a very, very real impact in people's lives.” Kayyali suggests a lot of this work falls under the category of harm reduction.

Johnny Mathias, Deputy Senior Campaign Director for Media, Democracy and Economic Justice at the advocacy organization Color of Change, said the world needs people willing to make such ethical judgments working at these firms. “I think the thrust of your question is, when people are doing work and they seem to be ignored, is it hopeless? It is probably a really challenging position to be in right now. We need to change both the external and internal environments in which people work. We need well-intentioned folks to be able to feel comfortable at technology companies, because I don't want technology designed only by the people who have no ethical questions about working on technology. That's how we get things that aren't designed for the needs of Black communities. If folks who design terrible, harmful tech are the only ones who are willing to participate in industry, that doesn't create the tech future that I think anyone wants.” He pointed to Color of Change’s “Beyond the Statement” tech platform, which encourages tech companies to focus on hiring people with civil rights expertise, for instance.

Some jobs are more ethically freighted, of course. “If you work at Facebook, some jobs do more harm than others,” said Justin Sherman. “Obviously people taking down non-consensual pornography are doing a good thing. There are people who work on the executive and lobbying teams who basically just lie to the media and bully critics all day, and that's not good.”

Yaël Eisenstat, who served as Facebook’s Global Head of Elections Integrity Operations for political ads in 2018 and is today a Future of Democracy Fellow at Berggruen Institute and a member of the Tech Policy Press masthead, said she hears from people weighing whether to work for the company frequently.

“My general advice to anyone weighing the pros and cons of working at Facebook, especially after everything that the public now knows, is this: If you believe you can make a difference within the constraints of a company that will not change its business model, growth mentality, and desire to dominate the landscape, then by all means, you should do the work," she said. "I would never discourage someone from trying to help protect the public from online abuse or harms.”

The key, Eisenstat said, is recognizing the bounds of what is possible. “If you are hoping to fundamentally change or affect how the leadership makes business and political decisions, then I would advise against it. I never went to Facebook thinking I would single-handedly change how the company operates. But now, more than ever, it would be a non-starter for anyone who still believes that internal employees will persuade Mark Zuckerberg, Sheryl Sandberg, and the rest of the executive team to fundamentally change their priorities.”

  4. The tech sector is an ethical minefield generally.

Gone are the halcyon days when the tech sector represented hope and opportunity without all the ethical baggage. “I have students who are at Google and Apple and Amazon and YouTube,” said Hany Farid. “All of them are looking around the tech sector like, what happened? I came here five years ago and we were the golden city on the hill, right? We were not Wall Street. We intentionally didn't go to New York and work at Wall Street because we want to be the good guys. And it turns out we're sort of the bad guys-- and for these young kids, it’s brutal.”

Ifeoma Ozoma said it’s hard to judge these companies differently on the merits. “I'm not going to tell anyone to leave their job because okay, you leave Facebook and you go to Alphabet-- are things any different there?”

Indeed, Don Heider sees similar systemic problems in other tech firms, such as Amazon. “Amazon to me is the closest cousin to Facebook in terms of current issues they face. They're both companies that have an interesting, good idea at the core, and they've been wildly successful beyond their or anyone's dreams. But with that huge growth and the huge profits and the huge success, there comes a moment of reckoning where you realize something is going wrong. So for Amazon, it's how they treat employees-- whether workers can take a few days off and not get fired, whether they can take a pregnancy leave.” Heider believes the only recourse is for these companies to step back and consider what sustainability will look like over a much longer time horizon-- decades, not years, and certainly not quarters.

  5. You’ll have to be comfortable with the judgment of your family and friends.

Even if other tech firms face similar ethical issues, “people who work at Facebook are going to be facing a more awkward Thanksgiving than perhaps those who work at Google or Amazon,” said Color of Change’s Johnny Mathias. There has been substantial brand damage to Facebook, which will likely follow it even as it rebrands to Meta. Last week, a CNN poll found that “roughly three-quarters of adults believe Facebook is making American society worse…. with about half saying they know somebody who was persuaded to believe in a conspiracy theory because of the site's content.”

“I remember back in the day when I was still at Dartmouth College,” said Hany Farid. “If one of my students got an internship or a job at Facebook, they would have that tattooed on their face. I mean, that was like grabbing the brass ring. That was really the proudest moment. I will tell you today that I have many, many students-- both current and former-- who tell me, ‘I'm embarrassed to tell people that I work at Facebook.’”

The evidence of Facebook’s harms, he said, is now well known to the public-- not just to academics and activists. “Can you work at a company whose products have led to a body count that is not insignificant? Think Myanmar, think Ethiopia, Sri Lanka, the Philippines, India, Brazil.” Pointing to COVID-19 misinformation, Farid noted the worldwide death toll. “At least a fraction of those are as a result of Facebook and WhatsApp and Instagram. I don't know how you work there. I really don't.”

  6. Your day of reckoning will come-- be ready to weigh your options.

Don Heider said for every person working at a company experiencing a crisis, “there is a day of reckoning. That comes when they feel like they can no longer work towards good and have an effect, whether they think the needle is moving in some way.” For many, the signal will come much as it did for Haugen or for Zhang-- when an initiative to change things or address harms dies when it reaches management. “I have great admiration for the people that are in there fighting the fight, trying to do good things. I really do,” said Heider. “But at the end of the day, if every initiative that started to change anything for the good dies in the C-suite, then you get a very clear message, no matter what's coming out of the CEO's mouth. You see the reality.”

Jared Harris said if a former student working in a difficult situation and contemplating blowing the whistle approached him with the question about what to do, he’d try to help them assess their own objectives. Some might want to take action, or perhaps become whistleblowers themselves. “Whistleblowers face a lot of tough repercussions, and I'm not sure I would expect a student or former student to have thought through, are they ready to take on the personal costs involved with that? And then that might raise the question of, is there a third way? Is there a way you can be influential inside the organization to accomplish things that you think would increase the likelihood of good outcomes? I mean, these are the things I might ask them to consider, but it's hard to advise someone what to do-- but I think we all face some version of that in our own lives.”

Ifeoma Ozoma said workers can make change inside companies, if they are willing to organize. “Where folks have more power than I think they recognize is a team like the site reliability engineers, who have actual control over the way that the platform functions and whether it functions or not,” she said. “And that's where I think if there were more collective action-- if a team like that, or even a large proportion of the team decided to go on strike-- that would be the type of thing that would get the attention of company leaders. If the sales team, or a large vertical within the sales team decided to go on strike, that would be the type of thing that I think would change things in a way that we haven't seen from all of the hearings, from all of the lawsuits and whatever else that's taking place externally.” The resource site she developed, the Tech Worker Handbook, offers practical advice for employees considering their options.

  7. Ultimately, government must act to relieve the ethical burden.

It is past time for Congress and regulators to take action in the public interest. Just as guardrails such as safety regulations or requirements to mitigate pollution help to relieve the ethical burden for any number of industries that produce externalities, clear rules could make things better for Facebook.

“We need regulatory solutions,” said Johnny Mathias. “We need action to actually empower folks and to make it more of a tenable place to work. We need whistleblower protections. We need a regulatory environment that supports the decisions that are being made there.”

People who want to work at ethical tech firms and use ethical tech products should consider what they can do to advocate for new laws, and to educate others about these issues.

“I think that if people were pushing their members of Congress as hard as Facebook's 15 lobbying teams both internally and externally were, then maybe we would have more progress" from lawmakers, said Ifeoma Ozoma. "Those folks are all being funded by Facebook, even though they're sitting up there on either side of the aisle, pretending to actually care about regulating the company. Those folks all have constituents who could also be handling them, but their constituents aren't.”

- - -

Many of the people who must weigh whether to stay at Facebook in the post-whistleblower age are highly skilled, and have many options. “If you have the freedom to go somewhere else or do something else, then you should do that,” said Ifeoma Ozoma. But the key thing, no matter what, is to be honest with yourself and those around you. “If you're honest about what's going on, then that's about all that we can ask of rank and file workers anywhere,” she said.

As for Facebook’s leadership? “Every company faces these moments of existential threat and crisis,” said Don Heider. “What, to me, a good CEO does-- a good management, a good board-- they step back and they say, ‘Oh my god, we really have to take a moment of reckoning and think about what we're doing, and can we continue it, and what course corrections can we take now at this crucial moment to get us back on course.’”

That prescription is quite a contrast to the company’s reaction to the past two months of revelations. In his remarks during Facebook’s third quarter earnings call, Mark Zuckerberg called the media scrutiny “a coordinated effort to selectively use leaked documents to paint a false picture of our company.” In his mind, he remains-- like many an emperor before him-- above reproach.


Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...