The Sunday Show: Facebook's Legal Woes

Justin Hendrix / May 1, 2022

If you take the time to look at the SEC filings for Meta Platforms, Inc., the company that operates Facebook, Instagram and WhatsApp, you will find various disclosures about its ongoing legal battles.

One major source of legal trouble for Facebook is the 2018 Cambridge Analytica scandal, which was exposed in March that year. The SEC filing states that "beginning on March 20, 2018, multiple putative class actions and derivative actions were filed in state and federal courts in the United States and elsewhere against us and certain of our directors and officers alleging violations of securities laws, breach of fiduciary duties, and other causes of action in connection with our platform and user data practices as well as the misuse of certain data by a developer that shared such data with third parties in violation of our terms and policies, and seeking unspecified damages and injunctive relief."

Another area at issue is competition. The SEC filing notes that the company is "subject to various litigation and government inquiries and investigations, formal or informal, by competition authorities in the United States, Europe, and other jurisdictions. Such investigations, inquiries, and lawsuits concern, among other things, our business practices in the areas of social networking or social media services, digital advertising, and/or mobile or online applications, as well as our acquisitions." That all adds up to significant antitrust concerns.

To get an update on some of the key cases under consideration, I spoke with one particularly keen observer of Meta: Jason Kint, the CEO of Digital Content Next. We spoke about the allegations in seven cases in particular:

  1. The Federal Trade Commission's suit against Facebook that scrutinizes its acquisitions;
  2. A U.S. District Court for the Northern District of California antitrust case that argues Facebook was deceptive about its privacy practices;
  3. Another Northern District case that alleges Facebook misled advertisers on reach;
  4. A suit brought by pension funds in Delaware related to Facebook's handling of the Cambridge Analytica scandal;
  5. A case brought by the Washington, D.C. attorney general also related to the Cambridge Analytica scandal;
  6. Another case brought in California related to data privacy concerns;
  7. A suit brought by Rohingya refugees from Myanmar seeking damages of $150 billion over allegations related to Facebook's role in the genocide there.

What follows is a lightly edited transcript of our discussion.

Justin Hendrix:

Jason, what is Digital Content Next?

Jason Kint:

Digital Content Next is a trade association of premium publishers. We do research, advocacy, policy work, and pull together executives in the publisher industry to learn from each other. Our focus and remit is entirely on the future of the digital content creation industry.

Justin Hendrix:

So it's safe to say that you are a chief critic of Facebook, of Meta as a company. You're essentially one of the people I think of as following its travails most closely; you're very active on Twitter, where you are, I think, in some cases a tormentor of Facebook. How do you think about your relationship with Facebook as a company, with regard to your role and what DCN does?

Jason Kint:

Yeah, it's interesting to hear it described that way. The company and its outsized role have certainly become an increasingly large concern. We've never seen a company like Facebook in the history of media, let alone the internet, with the reach and the impact it has on how information flows and influences. The relationship, I think, hit a key point in probably 2016, when I sent a letter to Mark Zuckerberg, and to be fair, I also sent it to Sundar Pichai at Google, expressing concerns. I think it was a timely letter, an open letter, we published it, given what we've seen play out over the next six years, which is essentially, in the case of Facebook, that the halo has really come off and they've given a wealth of material for us all to be concerned about. I've tried to make sure it's documented and understood, particularly the impact it has on the news and entertainment ecosystem.

Justin Hendrix:

Do you think there's a consensus in your membership or in the news and media business about Facebook's role or is it largely considered an adversary? How do you think that, more broadly, the industry looks at Facebook these days?

Jason Kint:

Yeah, I think for the industry perspective, it's more of a zero trust relationship, it's more transactional. They see the company as an important audience, particularly with Instagram, that is a requirement to do business. That plays into the antitrust concerns around the company. But you can't not have a business relationship with Facebook. You can't not recognize its importance to reaching the public, particularly younger people. It's just more of a lack of trust, it's a marketing and transactional relationship. My concern is that they play such an important role that that can't be the relationship forever. They have to have more ownership and responsibility to civil society, and they need to be held accountable for the mistakes of the past still, otherwise we can't expect anything to be different going forward.

Justin Hendrix:

One of the things that I notice you doing regularly, of course, is following very closely all of the proceedings and the cases that have sprung up around Meta, around Facebook and Instagram. I think of you as a source of endless evidentiary fragments that emerge from legal documents and government hearings and the like, and I want to talk about some of those today. In particular, we'll go through a handful of federal cases that are underway at the moment. I'm hoping to give the listener a sense of what are some of the legal issues that Facebook currently faces, as you say, in that effort by some, to pursue accountability against the company for what they regard as potential crimes. I want to start with a couple of cases that are focused on antitrust. For each one, I'm hoping that we can go through and hit some of the key points, the underlying issue in the case, where the case is in the process, and what are the possible ramifications of each one. I want to start in the realm of antitrust, and there are two cases right now that I know that you are focused on, one's in the District of Columbia, which is the Federal Trade Commission antitrust complaint. What's going on with this one?

Jason Kint:

The ruling is that it can move forward. There was a parallel case that the state AGs also filed, which is under appeal, so let's put that aside, but the FTC case did get the green light to proceed from the judge. Facebook had argued that the market in the case, the personal social networking market, wasn't properly defined, but the judge said, no, this can move forward. The claims that matter are probably around the acquisitions that Facebook made, Instagram and WhatsApp most importantly, and a bunch of smaller companies that they either bought or buried.

Why it matters is that the FTC is seeking to break up the company. As the case continues to move forward, that pressure to divest Instagram, in particular, from Facebook is obviously a very, very big deal. What I've told many over the last few years, and I think we should all reflect on, is this: if we look at what's happened and the concerns around Facebook over the last, let's call it five years, and we imagine they actually had to compete with Instagram, how would the two companies behave differently? I think that's the key question.

Justin Hendrix:

Now, as you mentioned, initially this was turned back by the court, on the consideration that it essentially hadn't passed muster. What do you think helped it across the line this time? What is it that made the court look at this complaint differently in the second round?

Jason Kint:

In the initial filing, they just didn't put in all the metrics; they didn't have numbers to back up the dominance of Facebook in this category of personal social networking. This time they included the metrics, and they did so from multiple angles. For anybody that's familiar with digital media, there are lots of different metrics we can all debate, but they looked at it in terms of time spent on the platform as a percentage of total time spent, and as a percentage of active users, so they hit it from, I think, three different angles. In all cases, it was pretty hard to argue that Facebook isn't dominant, and that answered what seemed to be the big failure in the first filing; the judge allowed them to fix that and add the data to back it up.

Justin Hendrix:

Let's move a little, I guess, west, and look at another antitrust case that's under consideration, this time in Northern California, what's at stake here?

Jason Kint:

That's the Northern District of California; it's a federal case, and it also is moving forward. It's a very similar case to what the FTC filed, but I think there are two dimensions that are probably most important and could get very uncomfortable for Facebook. The first is that the consumer plaintiffs in the case really lean into Facebook's abuse of privacy and data. If anybody's familiar with the research that Dina Srinivasan did around Facebook and antitrust, it echoes that quite a bit: you have a social networking platform which, at the very beginning, made privacy its number one concern for its audience. Then, as it got bigger and grew and became dominant, the allegation at least is that it abused that trust over time and really exploited users' data because it could get away with it. That element of data and privacy integrated with antitrust, which is really becoming a global discussion and concern, is really front and center in this case.

The second element of the case, which I would also highlight, is the advertiser plaintiffs; there's a set of advertiser plaintiffs in the case too. Their allegations include the Jedi Blue allegations, for allegedly colluding with Google around the project, Jedi Blue it's called, and I'm sure the listener can look it up and read more. You have the two most dominant players in the advertising marketplace having a secret deal, or at least a deal whose elements were not widely known, that affected the marketplace and potentially rigged it. That's also in that case and should be very uncomfortable, because it ultimately is a civil lawsuit, but when you get into market rigging, you can get into Section 1 criminal allegations too, when you've got executives signing that deal.

Justin Hendrix:

Potential criminal ramifications in this particular case.

Jason Kint:

Absolutely, yeah. Those same allegations are in the Google antitrust lawsuit that was filed by the state AGs, led by Texas, which, at least according to reporting, the Justice Department also may take up. The discovery we've seen on that deal between Facebook and Google is pretty eye opening, let's just say, especially what's come out in the Google case, because you've got executives at both companies who seem to understand the motivations behind the deal, which was, the way it reads from the internal emails, to keep Facebook out of competing with Google in return for other benefits.

Back to the Facebook antitrust suit. It also documents a pattern of behavior, and it goes through four examples, Netflix, Foursquare, eBay, and Google, where in various categories of business Facebook could be a threat: Netflix in the case of video, as Facebook was rolling out video; Foursquare regarding location; Google regarding ad tech. Facebook was a threat if it moved into those businesses. Instead, according to the allegations, they did a deal where they said, okay, we won't get into this business, but in return you will give us reciprocal data that will help make our advertising targeting business even stronger and strengthen the moat around it. There are interesting examples across category leaders, eBay in the case of commerce. The discovery in that case should be very interesting, let's just say.

Justin Hendrix:

Antitrust cases are not known to move quickly. What can we expect in terms of the timing on either these, in terms of progress or resolution?

Jason Kint:

I think you're spot on; that's the biggest concern. In the case of the federal antitrust case, we've already started to learn a lot through discovery. The FTC case was originally filed at the end of 2020, with the recognition that it would take a while, but it's been two years and they are moving along, and they do seek to break up the company. These cases are also very informative for the legislation that's happening to try to solve some of these concerns. Then there's something that I think is probably most important and also gets missed: it's really hard for Facebook to buy other companies. Buying other companies in particular categories is much more challenging when they've got these lawsuits underway, so it really does freeze them up too. I wouldn't in any way dismiss their impact. I think they're hugely important at this point, and they'll roll into deeper discovery. The FTC case will now, and I think the plan is to actually have it in court, trying the case, next year or so.

Justin Hendrix:

Let's stay in the Northern District of California and look at another case, one around potential fraud, the idea that Facebook may have inflated its reach. What's happening in this case?

Jason Kint:

That case is now four years old, which speaks to these things taking time, but the judge just certified the class on it, so it is also moving forward. I think there's probably a decent chance Facebook will do whatever it can to settle the case, particularly because of the addition of the fraud allegation to the complaint. That's a case the listeners may have heard about, where Facebook's potential reach, as presented to advertisers, was inflated, and it became obvious because, when you broke it down, it got larger than the actual U.S. census population; obviously they can't reach more, call it 18-to-34-year-old men, than actually exist in the U.S. As it went through discovery, there were pretty concerning emails that backed up the allegations that Facebook knew about this issue and chose not to disclose it.

Facebook's argument seems to be that advertisers don't buy that way: that even when you go in there and say, 'I want to target this demographic, and this location, and these types of people,' the number that Facebook presents to the buyer isn't how they actually buy the advertising; they buy on a click or on a purchase. But the data underneath that doesn't support it. And frankly, if you talk to people that spend money on advertising at a very high level and strategically, especially at the big agency holding companies, who say, we're going to go spend hundreds of millions of dollars on this platform versus that platform, the actual number of people, and the breakdown of the number of people, do matter when you're making those decisions ahead of time in planning. So it'll be very interesting to follow the case.

From a publisher perspective, putting on my publisher hat, it's the second case I can recall where there's evidence that Facebook's metrics were inflated and they chose not to tell the public about it. When I make a point about the lack of trust in the company, these cases really matter, because it's one thing, especially when you're moving as fast and are as big as a company like Facebook, to make mistakes or have issues with your metrics, because you're grading your own homework. But when you find out that you've made a mistake and then you don't correct the record, how you act after you make and discover that mistake really matters to the trust of the industry, and to whether or not you really can grade your own homework. So, I'll be watching that one.

Justin Hendrix:

What do you think in terms of timing on this one? How long can we expect it to go? You mentioned that it's been underway for some time.

Jason Kint:

Well, it's a milestone that the class has now been certified, and the discovery on it has been very interesting. Now I think settlement talks probably accelerate. If they don't settle, and I would bet that they will, then I think later in the fall, later this year, is when that one's expected to start to move into the court.

Justin Hendrix:

Let's come back east and go to Delaware where there is a lawsuit that I think joins together multiple plaintiffs, looking at essentially what could be a coverup of the company's involvement in the Cambridge Analytica scandal. What's at stake in this one?

Jason Kint:

That's in Delaware; that's the shareholder lawsuit. It does stem from the original Cambridge Analytica revelations in 2018. It's filed by pension funds, including one of the largest in the world, the California teachers' fund. I think what matters for that lawsuit, and what's at stake for Facebook, is, one, that these are pension funds that aren't likely to pursue a case unless they think it's worth the time and the money, and they've obviously got enormous resources to take on a case. We talk about institutional money that can take on a company like Facebook; they probably have more resources than maybe even our federal government in that case. They sued to inspect the board of directors' communications at Facebook, after Facebook settled with the FTC for $5 billion and with the SEC for $100 million, which happened and was announced on the same day in 2019.

They sued because, at least originally, the question they asked was: why would Facebook pay $5 billion? There had never been a settlement anywhere close to that level before, and Mark Zuckerberg in particular, but Sheryl Sandberg also, had avoided being deposed in discovery. There were a lot of questions in that case around how long Facebook knew about the Cambridge Analytica issues while, at the same time, executives and board members were exploiting it, either making money off the stock or leveraging their influence and their unique roles on the board of directors, while there was clearly an underlying issue with the way Facebook treated data. They inspected the documents, and that ultimately led to a derivative suit in which they're now suing the company over a variety of issues, based on what they were able to see from those documents, which the public hasn't seen; clearly there was something in those documents that made them feel like they have a good case.

Justin Hendrix:

Let's stay with Cambridge Analytica for a second, there are a couple more cases I want to talk about, one in DC Superior Court, one again in the Northern District, California, both related to consumer protection. Can we start with DC?

Jason Kint:

Yeah, sure. That's the Attorney General of DC, Racine, and that is a Cambridge Analytica lawsuit. It has a lot of the same elements as the Delaware shareholder suit in terms of when Facebook knew that they had an issue of their data being used in ways that the user certainly wouldn't have expected, and that were against what Facebook claimed was their policy. I should say this about Cambridge Analytica: too often, I think, the discussion goes immediately towards the 2016 election and political debates. More material, at least from the business perspective for Facebook, is that there was a third party that was able to siphon off and harvest significant amounts of personal data and then sell it to Cambridge Analytica, in clear violation of users' expectations and Facebook's stated terms of use.

The DC case that you're referencing starts to get into discovery, and there was actually a deposition originally ordered of Mark Zuckerberg to find out, again, when they knew they had an issue and the timeline of how they dealt with it, because Cambridge Analytica was just one example, and there were many other apps that had the same access to that data and could have abused it. What's interesting, when you ask me about DC versus California, is that the DC case seemed to be moving along at a pretty healthy clip, and it's actually where we learned, through some discovery early on, that there were, it appeared, a few dozen employees at Facebook who knew that there were problems with Cambridge Analytica back in the fall of 2015, many months before the press even started to report on issues.

There was interesting discovery happening, but a new judge was assigned to the case a couple months ago, and it seems to have turned in Facebook's favor quite significantly, because the new judge seems to be shutting down discovery. Facebook's narrative, that this thing's been going on too long, at least as their lawyers have been arguing it, seems to have, I hesitate to say snowed the judge, but the new judge seems to want to end the discovery process, which, if you've followed it over the last few years, would be incredibly premature to my mind, because Facebook has resisted most of the discovery in the case.

I think the DC case is very similar to the Northern District of California case. The difference is the new judge really shutting down the DC discovery, whereas in Northern California, the same efforts by Facebook's law firm to halt discovery and to halt depositions of Mark Zuckerberg and Sheryl Sandberg have really been called out by the judge, just in the last couple months. I don't know that I've ever heard a judge scorch a defendant's law firm, Facebook's law firm, quite like happened a few months ago. The judge has now invited the plaintiffs to file for sanctions, and they have, and Facebook's been called out very clearly for discovery abuse over the last four years. It's accelerating in a pretty interesting way: there are probably three or four different elements of discovery that clearly had Facebook super uncomfortable that are going to move forward in the next 60 to 90 days.

Justin Hendrix:

I happened to listen to that hearing where the judge excoriated Facebook's lawyers for their behavior in the case to date. I kind of agree with your characterization, it definitely was not a comfortable conversation.

Jason Kint:

No, it was not. Importantly, the judge ordered that Facebook have a senior executive at all discovery mediation sessions going forward who could make decisions on the spot regarding discovery. The judge even said, if you need to run those decisions up the flagpole, then you need to bring all those decision makers with you to the hearing, including Mr. Zuckerberg himself. The court seems to be past the limits of its patience with Facebook's avoidance of discovery.

Justin Hendrix:

Let's move on now, to what I think of as one of the most interesting, and to my mind, as not a lawyer, confounding cases that will probably get dealt with in the year ahead. This is another case in Northern California, focused on Myanmar, what's going on with this one?

Jason Kint:

That was a case that was filed at the end of last year, anonymously I think, under Jane Doe. It draws on the UN's documentation of the genocide that happened and, the allegations are, at least, of Facebook's role, which the UN covered in their report: the amplification of hate speech across the platform and the failure to take down particular posts that were problematic. It's a complex case because it's on behalf of citizens abroad as the plaintiffs in the California courts. I think it's filed under Myanmar, Burmese law. Then you've got elements of U.S. law like CDA 230, which your audience probably knows, which at least acts as a shield against liability for things that happen on Facebook's platform.

Justin Hendrix:

You mentioned that UN human rights experts investigating the genocide in Myanmar cited Facebook's role in spreading hate speech there. Facebook's own investigation into the situation essentially found fault with the company's practices, as I understand it. And yet this case is interesting to me, perhaps not because it is unlikely that there was a role, or that there should be some compensation of some sort or some liability, but more for the legal mechanics underneath it: how does it work to essentially try to assert a foreign nation's legal context in an American court?

Jason Kint:

I think we're about to find out. It was also filed abroad, I think, outside the U.S., for citizens that live in Europe. What probably ends up mattering more, because this is a legal dispute that is far outside my element, is the public discussion around it, because it involves genocide; it's a $150 billion lawsuit, if I recall correctly, before you even get to punitive damages. It got a lot of attention when it was filed, and to your point, given how much internal awareness and knowledge of the issues there was, it really does further the need for the discussion around Facebook's role in providing amplification, velocity, and reach to posts. It's one thing to have these posts exist on their platform; users can post whatever they want on the platform within their rules, and if they know about it, they take it down, and we can have that discussion.

This is, I think, more about Facebook taking those posts and accelerating them, and targeting them, and providing amplification to other people, and what their role and liability is there. That conversation is happening in a variety of places: courts, parliaments, and the U.S. Congress. I think this case just brings more attention to the issue, and to Facebook's role as a platform that reaches billions of people and can actually have consequences as real as genocide, or the war right now in Ukraine, or insurrections. There's a recognition of just how powerful this company is, and when you tie that back to some of our other discussions around governance and responsibility: do you want to have one person, Mark Zuckerberg, ultimately have that much control and power?

Justin Hendrix:

One of the things that I note in this complaint around Myanmar is the reference to the whistleblowers, to Frances Haugen, to Sophie Zhang. What effect do you think these most recent whistleblowers have had on any of the legal action against Facebook?

Jason Kint:

Some of the allegations were already in these cases, or in cases that were filed before these whistleblowers came out, and some of the evidence was coming out in discovery, but to have actual whistleblowers bring that evidence directly to the public, through the press or, in the case of Frances Haugen, through filings with the SEC, I think just brings further confidence to the problems in the cases. We talked about the inflated reach issue, and I believe one of Frances Haugen's whistleblower complaints to the SEC included the inflated reach metric. Particularly with the SEC, what's interesting, and what I'm going to be watching with all these cases, is this: for a long time, if you read through Facebook's risk disclosures as a company, these risks are identified as potential risks, along the lines of, our platform's really big, it could cause problems in the world, or, if we had a big data breach, it could cause problems for our underlying business.

A lot of these cases involve things where that risk was actually real and was already known by Facebook, versus just hypothetical or potential. When you learn that, over time, they've been saying that these are risks that could happen, but they actually knew, maybe, that they were happening, and they were just using PR to try to deny or deflect it, that becomes a real issue too, in terms of the SEC and risks to the company. We can't forget that they had the largest single-day drop in stock market history just a few months ago. These risks are real.

Justin Hendrix:

This document in particular, which I would recommend listeners look at, is a kind of extraordinary account of Facebook's role in Myanmar, a sort of tick-tock of the case, and the facts specific to it are laid out. You have both the broader context of Facebook's role in Myanmar, with the complaint concluding that, “Facebook's admissions that it should have done more to prevent the genocide in Burma and its subsequent efforts, if any, came too late for the tens of thousands of Rohingya who have been murdered, raped, and tortured, and for the hundreds of thousands who are now living in squalid refugee camps and displaced from their homes across the world.”

I should say that language comes just after the complaint describes Facebook's own belated acceptance that it had played a role in these events. And then the facts specific to the Jane Doe that you mentioned, the plaintiff, are really something: 16 years old, her father detained, beaten, and tortured for two weeks by the Myanmar military, and then of course she is forced to flee the country, fearing that she'd be abducted, or sexually assaulted, or even killed. Her account of how she made it across Bangladesh, and Thailand, and Malaysia, and what ultimately came of her life, now here in the United States, is really an extraordinary tale, almost zooming out first to the broader geopolitical context and then zooming in to the suffering of this one individual.

Jason Kint:

Yeah, that's the human element of... Each of these cases, some of them hit on the business concerns, some of them hit on data and consumer protection concerns, but that case brings to light the real civil society impact. And, yeah, wow.

Justin Hendrix:

Jason, I do understand that you look at these things out of a professional interest, that you are following these things very closely, you've been involved in digital media and advertising for close to three decades. Is there something deeper here in this for you?

Jason Kint:

That's a good question. There is no doubt that this affects the way our publisher members' news and entertainment content is both distributed and monetized, but one company having this much power, too much power, over the world is clearly problematic. With all these cases, and it's hard to follow them all, I understand that each case has its own complexities, but there are common themes that connect them all. Being able to understand those, I think, is critically important to understanding the consequences and the importance of change at Facebook.

All the cases seem to document the same issue: a company that has built an incredibly powerful platform but has not kept up its own responsibility for that platform and its consequences. The timeline of their awareness of issues, whether it be inflated numbers, data abuse that could be exploited by very powerful political actors, or hate speech and genocide, and what they did about it, while in parallel creating enormous wealth at the company, I think becomes a very clear issue. I think it's why the SEC's involvement in some of these cases becomes really, really important, frankly, because if you're treating risk to a company as hypothetical while at the same time you're benefiting from it and creating enormous wealth, then I think that needs to be dealt with, and the SEC is probably uniquely the place to do that.

Justin Hendrix:

The Frances Haugen papers really put the spotlight on Facebook's leadership ultimately being the kind of backstop to a lot of these questions. I wonder if you see, in these documents, a similar concern about a handful of individuals ultimately at the top of the company and the culture they've created, people like Sheryl Sandberg, but also Mark Zuckerberg.

Jason Kint:

Yeah, absolutely. I think it's why the Delaware shareholder case is, to me, the most interesting one, because it most closely documents the roles of the individuals in the decisions that were made. Sheryl Sandberg's a very interesting leader in that she's only testified, I think, once that I'm aware of, and in the case of her hearing in 2018, at least the New York Times reported that there were a number of areas that were off limits, that she didn't even have to answer questions on at the hearing. She plays a kind of unique role as Mark Zuckerberg's partner in crime, and I use that with a big question mark next to it. If you look at some of these allegations, 'how much have they protected each other' is a big question mark. Then there are the board members that were very, very close to Mark. Ultimately, the core of the Delaware case documents how it was futile to even go to the board about the issues, because ultimately Mark controls the board, controls the main committees, and the company. It's one person that's in charge of everything.

Justin Hendrix:

In my mind, I try to hold that one person conceptually next to this Jane Doe, and to make sure that I keep both those people in mind as equals. As for this company, in my view, I think your members are right to have little trust. I would argue, perhaps, that governments and citizens should have little trust at this stage.

Jason Kint:

I think that's fair. I think it's been evident for a few years. When you look at the exploration of new platforms, or new places that they may go, whether it be video systems to communicate with each other or virtual worlds that they fund and create, you have to go back to the pattern of behavior that's been documented and the same leaders that are still responsible, and ask yourself why on Earth you'd expect anything different going forward if you give them that much control over future ways in which we communicate across society. I think the lack of trust is real, and I don't think that can be repaired with the same governance and leadership still in place.

Justin Hendrix:

I should hope that perhaps there will be some accountability, either through these cases or through other means as this information comes to light. I appreciate you keeping us up to date on all the details, and I hope you'll continue to do so.

Jason Kint:

Thank you for having me, Justin, thank you for your work.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...
