What Does the First US Social Media Addiction Trial Mean for the Tech Industry?

Varsha Bansal / Feb 8, 2026

Varsha Bansal is a fellow at Tech Policy Press.

Parents of children who have died due to social media harms hold a vigil for their kids on Thursday, February 5, 2026 at the Los Angeles Superior Courthouse, ahead of the landmark social media addiction trial set to begin next week. (Jordan Strauss/AP Content Services for ParentsTogether Action)

Is social media addictive? And if so, can social media companies be held responsible? These questions and more will be argued in the coming weeks as lawyers deliver opening statements in the first social media addiction trial, set to begin next week in Los Angeles Superior Court.

The case before the court, filed by a 20-year-old woman who goes by the name K.G.M. and her mother against Meta, YouTube, Snap and TikTok, alleges that these social media platforms harmed her through their algorithms and design. It’s the first of a series of bellwether cases selected from a large pool of complaints filed by approximately 1,600 plaintiffs from across the country — including hundreds of families and school districts — consolidated in what’s known as a Judicial Council Coordination Proceeding, or JCCP. These trials are chosen as representative test cases to help determine verdicts for the remaining plaintiffs. Snap and TikTok have settled with K.G.M., which means the upcoming trial will have only Meta and Google as defendants.

“This trial tests whether social media companies can be held liable for the way their platforms are designed, not just for third-party content,” Mofe Koya, an associate member of the University of Cincinnati Law Review, told Tech Policy Press. “If plaintiffs succeed, it would seriously weaken the protective scope of Section 230 by carving out product-design and conduct-based claims.”

Section 230 of the Communications Decency Act shields social media platforms from liability for most third-party content posted on their services. But these trials are unusual because the plaintiffs’ arguments focus less on who posted the content than on how these companies allegedly built addictive products with features such as “infinite scroll” and “autoplay” — thereby causing harm to users.

Typically, Section 230 is used to dismiss such cases against social media companies early on. But in this one, the judge said it is unclear whether it was the third-party content or the alleged design defects that kept kids hooked. A jury will decide.

The stakes are high for social media companies, legal experts say. “That's because a series of verdicts in the plaintiffs' favor that are upheld on appeal would fundamentally change the way social media platforms deliver content to minors,” said Clay Calvert, a nonresident senior fellow in technology policy studies at the American Enterprise Institute, a think tank, who has been following the cases closely. “So that would be big.”

That said, this trial wouldn’t mean that social media companies would become liable for “all user-generated content,” said Daryl Lim, a law professor at Penn State University. “The narrower significance is whether courts will treat addictive or unsafe design as actionable conduct distinct from protected content moderation or publication.”

Evidence Exhibit

Internal Facebook Messages

In re: Social Media Adolescent Addiction MDL
Case No. 4:22-md-03047-YGR • N.D. California
██ ██ — 9/10/2020, 9:13 AM PDT
oh my gosh yall IG is a drug
████████ — 9/10/2020, 9:14 AM PDT
Lol, I mean, all social media. We're basically pushers.
██ ██ — 9/10/2020, 9:15 AM PDT
Seriously it is! We are causing Reward Deficit Disorder bc people are binging on IG so much they can't feel reward anymore...like their reward tolerance is so high
████████ — 9/10/2020, 9:16 AM PDT
Yeah, I was starting to think the same thing yesterday when you made that gambling reference. It's kind of scary 😟
Source: CourtListener • Document 2648-39 • Internal Facebook/Meta communication

What to watch for in the upcoming trial

One thing to watch is how the court frames the conduct-versus-content distinction. If the judge accepts that design features constitute independent conduct, Section 230 defenses will be weakened. Much will also depend on whether the plaintiffs can tie specific design features to concrete harm, especially in minors.

The second thing to watch is the impact of internal documents showing that the companies knew about harm but prioritized engagement anyway. A massive trove of unsealed documents emerged before the trial, with more expected to come out during the proceedings in the coming weeks. The JCCP has overlapping legal teams and a shared evidence base with federal multi-district litigation that will advance in court later this year.

“When you are able to see with your own eyes the exchange between Instagram engineers joking about how they are basically ‘drug pushers LOL’ and how Instagram is a drug and they make references to gambling and references to how they are basically Big Tobacco because they are burying evidence that proves that their product is addictive — that is very powerful,” said Sacha Haworth of the Tech Oversight Project, a nonprofit watchdog group. “That will have an enormous impact if presented at trial.”

For Julianna Arnold, whose 17-year-old daughter Coco died of fentanyl poisoning from the pills sold to her by an older man she met on Instagram, this trial is a crucial moment of transparency.

“We're really looking for the truth to come out, not these canned statements that they [companies] make to the media or the pre-prepared testimonies they create for congressional hearings and the false answers that they provide to legislators,” Arnold told Tech Policy Press. “We finally want people to see the evidence and internal research and connect that with the real world harms that our children experienced, in a hope that we will force these companies and pass legislation to make them design their products safely for young people.”

Another focus of attention in the courtroom will be the testimony of current and former employees of the tech platforms. Plaintiffs’ lawyers will press them on the extent to which they were aware of the effect of their algorithms and business model on children and adolescents. Notably, Meta founder and CEO Mark Zuckerberg is expected to take the stand in the coming weeks.

“All the jurors, I would suspect, are going to have heard of Mark Zuckerberg, so his credibility will be very important in terms of potentially influencing the outcome of the case,” said Calvert. “He's just one witness, but obviously he's a big witness.”

Evidence Exhibit

Teen Mental Health: Creatures of Habit

Internal Instagram Presentation, 2019
In re: Social Media Adolescent Addiction MDL
Case No. 4:22-md-03047-YGR • N.D. California
Instagram sets the standards not only for how teens should look and act but also for how they should think and feel.
Teens feel themselves to be at the forefront of new social behaviours to which there is no consensus on how to behave or cope. They sorely lack empathetic voices to whom they can turn for support.
Teens talk of Instagram in terms of an ‘addicts narrative’ spending too much time indulging in a compulsive behaviour that they know is negative but feel powerless to resist.
Source: CourtListener • Document 2648-40 • Internal Instagram/Meta research

The plaintiffs’ side is expected to argue from evidence that surfaced during discovery, showing that leaders and employees researching these issues at the companies knew how design features like infinite scroll and autoplay affect kids. They are also expected to draw comparisons with other addictive products, such as cigarettes, arguing that the companies here likewise knew their products were addictive and that their designs resulted in mental health harms for children and young users.

“The Big Tobacco analogy is rhetorically powerful,” said Lim. “But courts may be more persuaded by concrete evidence about platform features, defaults, and foreseeable risks than by broad comparisons alone.”

One question the jury will have to answer is whether the social media platforms are substantially liable for the harms claimed by plaintiffs, or whether other circumstances may explain a particular outcome. For instance, defense lawyers may probe whether K.G.M. had problems at home, pre-existing mental health issues, or something else going on in her life at the time that caused her harm, rather than the platform’s design. “If it’s the design, that's going to bode well for her case,” explains Calvert. “If it's the content that she watched, then that's going to bode well for the platforms in terms of Section 230 immunity. Or was it something else altogether, and that would also bode well for the defendants in that case, that something else caused her harm.”

One argument legal experts expect the defense to deploy is grounded in Section 230: that platform algorithms ultimately just organize user-generated content. They may also argue that platforms merely provide tools, and users choose how to engage with them. Other strong points on their side are First Amendment concerns should courts attempt to regulate expressive design — and, perhaps strongest of all, the claim that social media addiction theories rely on unsettled science.

There are some weaknesses, too. Koya explains that these arguments may be too reliant on Section 230 in a legal climate that is increasingly hostile toward it. Moreover, she adds, the internal documents undermine the companies’ “neutral platform” narrative, and she believes courts are growing less persuaded that algorithms are passive.

Meanwhile, dozens of impacted parents from all around the country are slowly trickling into Los Angeles to watch the trial proceedings closely. “We lost our kids, and no one ever gave us any explanation of why this happened, you know, and how it happened,” said Arnold, who plans to be present in the courtroom every day. “This is that vindication for us to feel like our kids didn't die in vain, and they can save other children and families and change the system.”

