Addictive Technologies Conference Tackles Social Media’s Impact On Children, Online Safety

Gabby Miller / Apr 16, 2024

A young child staring at a smart device.

Last Friday at Seton Hall University School of Law, attorneys and academics gathered for a conference on “legal responses to addictive technologies.” The day-long gathering zeroed in on the impact of screens, social networks, and online games on kids, with experts exchanging ideas and approaches to legislation and litigation meant to address these harms.

New Jersey’s Attorney General, Matthew Platkin, kicked off the morning with a keynote address on the fight to protect children by holding social media companies accountable. Calling this issue “the cause of a generation,” he homed in on how Meta allegedly refused to disclose or meaningfully mitigate harms to users under 13, in violation of the Children's Online Privacy Protection Act of 1998 (COPPA). He also drew attention to the ways Meta approached children it knew to be on its platforms (the company called them a “valuable but untapped audience”), all while suppressing research that indicated the platform was unsafe for these young users.

Platkin is part of a bipartisan coalition of attorneys general across 41 states and Washington, DC that filed suit against Meta last October. The 233-page filing alleges that the tech company designed and deployed features on Facebook and Instagram that encouraged addictive behaviors it knew to be harmful to its young users’ mental and physical health. Much of the underlying evidence came from documents leaked by former Facebook employee Frances Haugen. Platkin told conference-goers that Meta is only the first step in holding “bad actors” accountable, and the same coalition is actively investigating TikTok over similar conduct.

Other litigation working its way through the courts involves hundreds of school districts and individual plaintiffs suing social media companies Meta (Instagram and Facebook), Snap (Snapchat), TikTok, and Google (YouTube) over their products’ designs and their potential impacts on the youth mental health crisis. The cases were consolidated in federal and California state courts in late 2022, and test case submissions are currently under consideration for the bellwether trials, a process for resolving large-scale litigation in which early test cases serve as predictors of outcomes in similar cases.

One attorney helping lead this effort is the Social Media Victims’ Law Center’s Laura Marquez-Garrett. In a talk on the ways some of their clients were allegedly affected by social media platforms and their products’ designs, Marquez-Garrett highlighted particularly egregious accounts of young users who were sexually exploited online, repeatedly exposed to self-harm or violative content, and, in extreme cases, died by suicide after this exposure. Their 15-minute presentation was met repeatedly with audible gasps from the audience.

Marquez-Garrett also focused on Snapchat’s “Quick Add” feature, which allegedly recommends connections between drug dealers and underage users, making it easier for them to find one another. This demonstrates that not only are social media platforms’ products defective, but their programming is too, according to Marquez-Garrett. They also told the audience that rather than asking children how much they use social media, adults should ask what kinds of content kids are regularly exposed to. “Ask them [kids], have you ever been recommended to a drug dealer? Have you ever seen – here's one of the shockers – a live suicide or murder? I now ask children that question when I do interviews, and you would be horrified at the number of children who say yes,” they said.

During a presentation on regulatory approaches to platform design, Fordham University School of Law professor Zephyr Teachout made the case that while the attorneys general suits and ongoing private litigation are good for establishing standards, they are “absolutely insufficient” for making the internet wholly safe for children. While the courts may be able to recognize that a social media platform’s design features are not speech, this “critical” litigation is “not the end, but it is an essential piece,” Teachout said.

Other panelists throughout the day pushed back on certain legislative approaches typically favored by big tech companies and their allies that shift the responsibility onto parents to monitor their child’s social media use. In a panel on “online harms,” Corbin Evans, Senior Director of Federal and Government Affairs at the American Psychological Association, characterized these efforts as a “wild misdirection strategy” by tech companies. “We think it's a nonsensical, non-starter of a solution.” While Evans does believe there should be some parental controls available on social media platforms, “under no circumstances” is this an effective mitigation strategy for preventing the harms discussed at the conference. “Instead, we think it is incredibly vital and important to ensure that the responsibility is continually pushed back onto the tech companies themselves, and that these companies take responsibility to change their products,” he added.

Gaia Bernstein, technology privacy and policy professor of law at Seton Hall University School of Law and the organizer of the conference, said tech firms want to shift the blame for online harms onto parents. “Why do tech companies prefer to give parents roles?” she asked. This “parental responsibility” strategy not only transfers the responsibility – and blame – to parents, but is also ineffective and causes harm to consumers, Bernstein explained. “The argument is, ‘You chose to do this, you are responsible.’”

However, this individualistic understanding of choice reflects a “very poor, impoverished understanding of autonomy,” argued Elettra Bietti, an assistant professor of law at Northeastern University School of Law. In her presentation on the “Regulating Addictive Design” panel, Bietti challenged the “binary understanding of the human subject as autonomous to the extent they're not controlled by someone else.” Instead, considering how infrastructural pressures and dynamics shape human behavior and attention, she offered an account of how freedom is constructed. This shift may help overcome “some of the pushback that comes from the First Amendment fields, for example, that tell us you cannot regulate platforms because that is an interference with individual autonomy,” Bietti said.

Concerns over how child online safety legislation and litigation may affect freedom of expression were an underlying theme for much of the day. In the New Jersey Attorney General’s opening address, Platkin said he considers any attempt to make this conversation about restricting free speech a red herring. “There have always been and always will be limits on what the First Amendment says and where it applies. And if you're engaging in deceptive, unconscionable business practices, you can't lie to your consumers.”

There was also some hesitancy in responding to the “science wars” that have broken out in recent weeks surrounding the launch of social psychologist Jonathan Haidt’s new book, “The Anxious Generation,” and a scathing takedown of it in the journal Nature. The debate, which centers on whether social media can be blamed for an “epidemic” of teen mental illness, is “an incredibly complex equation,” the APA’s Evans said. While the Association might draw the line on causal claims “a little bit shorter than what Dr. Haidt included in his recent publication,” he believes that falling on one side of the spectrum versus the other is “not contributing to a furthering of a conversation that something needs to be done.” In Nature, University of California, Irvine Associate Dean and Professor Candice Odgers similarly argues that “rising hysteria” around whether tech companies have “‘rewired childhood and changed human development’” by “‘designing a firehose of addictive content,’” as Haidt put it, is a distraction from tackling the real causes of the rise in teenage mental illness.
