Meta’s Mark Zuckerberg Faces Jury in High-Stakes Civil Trial in Los Angeles
Varsha Bansal / Feb 21, 2026
Varsha Bansal is a fellow at Tech Policy Press.

Meta CEO Mark Zuckerberg arrives for a landmark trial over whether social media platforms deliberately addict and harm children, Wednesday, Feb. 18, 2026, in Los Angeles. (AP Photo/Ryan Sun)
The much anticipated testimony of Meta founder and chief executive Mark Zuckerberg took place in the Superior Court of Los Angeles on February 18. Dressed in a blue suit and gray tie, Zuckerberg walked into the courtroom to defend his company against allegations that it designs products that are addictive and harm children.
While he has defended his company in Congressional hearings in the past, including a memorable appearance before the Senate Judiciary Committee in 2024 when he was made to apologize to parents who claimed social media harmed their children, this was the first time Zuckerberg took the stand in front of a jury to answer questions in what is regarded as a landmark trial.
The plaintiff in this case is a 20-year-old woman who goes by the initials K.G.M. She filed a lawsuit against Meta, Google, Snap and TikTok alleging that the addictive features of these platforms got her hooked and contributed to her mental health issues. Zuckerberg sat in a courtroom filled with journalists and several parents who say social media harmed their children or took their lives. The parents, who arrived early in the morning to try to snag a seat in court, are among thousands of plaintiffs whose lawsuits against social media companies, including Meta, have been joined together in what is known as a Judicial Council Coordination Proceeding (JCCP).
The outcome of the trial could have implications far beyond the firms named in the suit. The potential policy significance of this trial is two-fold, explains Dani Pinter, chief legal officer and director of the Law Center at the National Center on Sexual Exploitation (NCOSE). If Meta and the other firms are found liable, “it will reveal the truth to the world about what Meta knows about the dangers and harms its platforms inflict on users,” she said. “And second, a finding of liability establishes a baseline for social media platforms' broader responsibilities regarding certain dangerous features, particularly those related to child users.”
Legal experts tracking the case say that there was “zero upside” for Zuckerberg in testifying in court.
“The best he could do was not make their case worse,” said Eric Goldman, a law professor at Santa Clara University. Zuckerberg stuck mostly to his past statements, offering safe responses and clarifications. Among other things, he said that Meta’s products are designed for users over 13 years of age, and that he regrets not determining early on how to better identify users under 13 and remove them from his platforms.
Zuckerberg was grilled by the plaintiff’s lawyer, Mark Lanier, who at one point referred to an internal Meta review that estimated over four million people under 13 were using Instagram in 2015. Zuckerberg said that many kids lie about their age, and that Meta developed measures over time to detect underage users. But Lanier said this wasn’t the case when K.G.M. joined the app; at the time, she was nine.
One lawyer Tech Policy Press spoke to about the testimony was persuaded by Zuckerberg’s arguments regarding Meta’s evolving safety practices, noting he presented safety as an ongoing and evolving process, suggesting that the company took these issues seriously.
“He discussed how safety features were added over time, acknowledged that certain policies reflected an earlier era and were later abandoned, and conceded that Meta did not always get everything right immediately,” said Kimberly Pallen, a trial attorney at law firm Withers.
“This supports an argument that Meta acted reasonably by continuously improving its systems, adding more proactive tools, and revisiting prior protocols as concerns emerged.”
But for Kelly Stonelake, who worked with Meta for nearly 15 years before leaving the company and becoming a whistleblower in 2025, the courtroom testimony felt less like a departure from prior statements and more like a continuation of a well-established narrative: harm is unfortunate but not systemic, youth use is policed, safety is core, and the company is constantly improving. (Stonelake has sued Meta alleging sex discrimination, harassment and retaliation.)
“Mark suggested safety and ‘user value’ are his priorities, but the business model is driven by time spent, engagement, and thus ad revenue,” Stonelake told Tech Policy Press. “Features like infinite scroll, algorithmic amplification, and persistent notifications are engineered to increase use without consequence for the impact on vulnerable people.” She said that even though Meta has publicly stated that what’s good for users is good for the business, internally the conversations were fairly different, with the company’s growth goals often being framed in terms of “maximizing engagement, opens per day, and time spent.”
“When your revenue model depends on advertising tied to attention, there is an inherent tension between reducing time spent and maximizing revenue,” said Stonelake. “That tension rarely showed up in public testimony, but it was very real in product strategy discussions.”
But for the court to hold the company accountable, the evidence would have to go beyond claims that Meta sought user engagement. Goldman said Meta’s pursuit of engagement is not in dispute; there is ample evidence of it. Instead, he said, the questions here are “did Meta intentionally addict its users in a way the law recognizes” and “did those efforts result in the victims' harms in a way the law recognizes.”
“To establish those claims, the plaintiffs have to do more than simply show that Meta sought to increase user engagement,” said Goldman. “We are going to get a battle of experts debating whether the law recognizes a phenomenon called ‘social media addiction’ and, if so, how to apportion responsibility for such addiction between social media services and other actual or potential causes of the victim’s harm.”
Asked whether he believes people tend to use something more if it is addictive, Zuckerberg said that while that can be true, it does not apply in this case.
While it’s unclear how the trial will play out, the biggest victory so far has been simply having the trial itself take place, say several people from the plaintiff’s side. This trial is the first of many bellwether trials that will be held this year — part of the consolidated case with around 1,600 plaintiffs.
The case “targets the product design itself — the algorithms, the infinite scroll, the notification systems, the engagement loops deliberately built to maximize time on platform regardless of the cost to users,” said Stonelake. “For the first time, these companies cannot hide behind Section 230 immunity,” referring to Section 230 of the Communications Decency Act, which says platforms aren’t responsible for most content posted by third parties.
Meta is also facing another trial in New Mexico, with similar allegations. These trials add to the “mountain of evidence that social media companies design for addiction” and “that executives at these companies knew the harms they were causing to young people but continually chose profits over safety,” said Josh Golin, executive director at FairPlay, an organization that works to protect kids from Big Tech. “The trial is also helping to demystify social media — it’s not inherently addictive or dangerous — the social media products young people are using are harmful because of deliberate design choices.” Golin believes that the trial will help move forward legislation at both the state and federal level, such as the Kids Online Safety Act, which would create a mechanism to hold companies accountable for their design choices.
As the trial advances in the coming weeks, experts say, Meta may seek to shift the debate towards content. But Stonelake says that it is important to remember that typical kids aren’t going to these platforms in search of harm. “I’m thinking about stories like Mason Edens, who turned to his algorithmic feed in the wake of a breakup, searching for inspirational quotes. He was fed a steady stream of pro-suicide content until he took his own life,” she said, referencing an incident involving TikTok. “While the tech company may not have uploaded the pro-suicide content, they knew what would keep a sad kid hooked and fed it to him endlessly.”
Just a day after Zuckerberg’s courtroom appearance came another significant testimony, this time from a former employee, Brian Boland, who spent a decade building the company’s ad machine. Boland testified at length about how Meta’s revenue ambitions shaped product design.
Cristos Goodrow, vice president of engineering for YouTube, is expected to testify in the courtroom on Monday.