Fight Over State Child Online Safety Laws May Last Years
Justin Hendrix, Ben Lennett, Gabby Miller / Sep 18, 2023
Gabby Miller is Staff Writer, Ben Lennett is Contributing Editor, and Justin Hendrix is CEO and Editor of Tech Policy Press.
After a wave of legislation focused on child online safety swept through state legislatures over the past two years, legal challenges against the new laws are gaining traction in federal courts. But rather than signaling a change in the tide, the lawsuits may ultimately spur a new round of bills that address flaws in those passed in the first wave.
Putting aside the merits of the various approaches to child online safety that animate recent legislation, and whether they may be effective, it is clear that the overarching issue is one that will survive well into the future. A recent national poll on children’s health found that use of devices and social media sits at the top of parents’ concerns.
What follows is a summary of the legal and political debate involving child online safety laws and where it might go in the future.
Federal courts question the constitutionality of state child safety laws
Challenges to new laws are mounting in multiple states. Late last month, federal district court judges in Texas and Arkansas issued temporary injunctions against two recently passed “age verification” laws in those states, both scheduled to take effect on September 1st. And today, a federal judge in California granted a preliminary injunction following a challenge to the California Age-Appropriate Design Code Act.
Signed in June by Governor Sarah Huckabee Sanders, a Republican, the Arkansas “Social Media Safety Act” would have required a minor to seek a parent or guardian’s consent to open a social media account and further mandated that platforms verify the age of the account holder using either government ID or commercial verification technologies. Judge Timothy L. Brooks issued the injunction just hours before the law was set to take effect.
The same day, Judge David A. Ezra similarly enjoined Texas’ HB 1181, which would have restricted minors’ access to adult content online and required websites like PornHub to verify the age of their users and display a landing page message stating that pornography “increases the demand for prostitution, child exploitation, and child pornography” and is “potentially biologically addictive.”
While the two laws differ in focus, the legal challenges mounted against them were largely the same.
NetChoice, a tech lobbying firm representing companies like Amazon, Google, TikTok, Meta, and Twitter (now X), filed suit in Arkansas to block the Social Media Safety Act (originally AR SB 66). In NetChoice v. Griffin, NetChoice argued that the act violates all online users’ – not just minors’ – First Amendment rights and is “hopelessly vague” in defining what a social media platform is, “leaving companies to guess whether they are regulated by the Act.” The law covers websites with “a substantial portion of material that may be harmful to minors.”
In his opinion, Judge Brooks largely agreed with the plaintiffs, noting that the Arkansas Act “is not targeted to address the harms it has identified” and further research is needed before creating more “narrowly tailored” regulation. “Age-gating social media platforms for adults and minors does not appear to be an effective approach when, in reality, it is the content on particular platforms that is driving the State’s true concerns,” he wrote. He further found the law to be too vague in a number of ways, including its definitions of what constitutes a social media platform. For instance, the law does not cover YouTube, even though the state apparently referenced it in an exhibit discussing the dangers kids face on social media.
In Texas, a group of pornography platforms and creators called the Free Speech Coalition mounted a legal challenge this summer (Free Speech Coalition, Inc., et al. v. Colmenero), characterizing HB 1181’s requirements as blatantly unconstitutional and violating the First Amendment rights of Texans. Judge Ezra agreed “that the state has a legitimate goal in protecting children from sexually explicit material online,” but that doing so must be done constitutionally. “A party cannot speak freely when they must first verify the age of each audience member,” the opinion states. “This has a particular chilling effect when the identity of audience members is potentially stored by third parties or the government.”
In California, NetChoice brought a suit seeking to block that state’s recently passed Age Appropriate Design Code (AADC), set to take effect next summer. The AADC takes a very different approach than the laws in Arkansas and Texas. It requires age assurance – which is distinct from age verification – and new data protections for minors, but also stipulates that businesses providing a digital media product likely to be accessed by children must “consider the best interests of children when designing, developing, and providing” it. Despite the law’s ‘safety by design’ approach, a federal judge in the Northern District indicated concern over whether it is narrowly tailored enough to be constitutional.
In her ruling, US District Judge Beth Labson Freeman concluded that “NetChoice has demonstrated that it is likely to succeed on at least one of its First Amendment theories set forth in… [the] complaint” and that “NetChoice also… demonstrat[ed] a likelihood that it will suffer irreparable injury” if the law goes into effect.
The beginning of the end, or just the beginning?
Decisions in favor of NetChoice and others opposing child online safety bills would seem to signal the end of state efforts to limit children’s and teens’ access or exposure to certain content, or to address design features and business practices deemed harmful. But more likely it is the beginning of a long legal and policy debate. Though federal district courts found the Arkansas, Texas, and California laws likely unconstitutional, the states may yet appeal the decisions. And it is not certain the appellate courts will agree with the earlier rulings.
The courts have already demonstrated a split in another ongoing legal battle over the internet and state laws. In 2021, Florida and Texas passed laws restricting the ability of social media platforms to moderate content posted by users. Tech companies challenged both laws and initially won stays from the district courts. Texas and Florida appealed, and though the Eleventh Circuit Court of Appeals upheld a ruling saying that most of the Florida law was “substantially likely” to be a violation of social media platforms’ First Amendment rights, the Fifth Circuit Court of Appeals upheld the Texas law and reversed the district court's injunction. Both cases may end up before the Supreme Court.
And there are signs that legislators will be undeterred even if round one of legal fights goes against the states.
For instance, Tech Policy Press attended a panel discussion at last month’s National Conference of State Legislatures (NCSL) in Indianapolis, Indiana, titled “Protecting Kids on Social Media.” There, state lawmakers from different parties directed their animus not at one another, but rather at the tech industry front man who joined them on the stage. Minnesota State Representative Kristen Bahner (D-MN34B) and Utah Senator Michael McKell (R-UT25), legislators who both championed bills aimed at child online safety in their states, had it out with Carl Szabo, Vice President & General Counsel of NetChoice.
The response in the room, which was crowded with lawmakers from across the country, suggested that while Szabo and his paymasters may win in courts in the near term, lawmakers in multiple states and from both parties are already preparing for round two. But the seeming unanimity of the lawmakers against the NetChoice line of argument did perhaps obscure the reality that the laws that Minnesota Rep. Bahner and Utah Sen. McKell championed in their states are rather different.
The Minnesota bill, the Minnesota Age Appropriate Design Code Act (HF 2257), is modeled after the California Age Appropriate Design Code (AADC) that passed last year, though the Minnesota version failed to pass in the previous legislative session. The legislation would have required online services “likely to be accessed by children” to conduct impact assessments around data privacy and protection, assess whether the design of the online service might contribute to children being exposed to harmful content, and consider the role of algorithms and targeted advertising in potential harms.
Following the California AADC model, the Minnesota bill contains provisions similar to those targeted by NetChoice in its lawsuit filed in California’s Northern District. But Rep. Bahner was hopeful that, even though the bill did not pass – due to a “minor technical issue,” as she put it – some of the legal vulnerabilities that NetChoice and others have targeted have been addressed.
In Utah, Sen. McKell backed SB0152, the Utah Social Media Regulation Act. The law, which passed in March, requires that tech firms verify the age of users; requires companies to get parental consent for a child to have a social media account; and puts other restrictions on accounts held by minors, such as prohibitions on direct messaging, advertising, and the collection of personal data. And, perhaps most controversially, the law gives parents and guardians the right to access a minor’s account, including direct messages.
These two bills generally fit the pattern observed across the dozens of child safety bills proposed in states in the past year. A Tech Policy Press snapshot in May found 144 child online safety bills spanning 43 states were introduced in 2023, noting that “in very broad strokes, Democrats have proposed laws similar to the AADC that require platforms to mitigate harms to minors, while Republicans appear to be focused on age verification measures or filtering laws, often to restrict access to pornography.”
Age verification laws often require children to get parental consent before creating a social media account, and they are typically criticized for violating kids’ privacy and speech rights. Others argue that the bills are a way for Republicans to advance a conservative ideological agenda, likening them to book bans and “don’t say gay” campaigns pushed in some of the same states.
Arkansas and Texas are just two of seven Republican-led states to pass some type of age verification law in recent months. Others include Louisiana (HB 77), Mississippi (SB 2346), Montana (SB 544), Utah (SB 287), and Virginia (SB 1515). A Louisiana law passed in 2022, which requires age verification on websites that contain a “substantial portion” (33.3%) of adult content, has inspired a slew of copycat legislation; The Verge recently reported “at least 17 copycats in state legislatures across the country with very few textual changes.”
Irene Ly, counsel for tech policy at Common Sense, an organization dedicated to helping kids and families navigate the digital world, says age verification bills maintain the status quo on platforms.
“Once a kid does get consent to go onto the platform, by and large, they are going to face the same harmful practices that they're seeing now,” Ly explained in an interview. The same data is being collected, the same ads are targeting children, and the same design features are shaping user experience. There are also worries that parents could restrict LGBTQ+ youth from accessing information about gender and sexuality and cut them off from online communities that can serve as critical lifelines.
Meetali Jain, Director of the Tech Justice Law Project, feels that the AADC is a more innovative way of thinking about tech regulation. “It’s not playing whack-a-mole with content moderation. That's not the focus of the bill. It’s not the intention of the bill to be a content moderator,” Jain told Tech Policy Press. “It's really to think about how the government can incentivize tech companies to think differently about the design decisions that they make, such that the user experience will be better.”
At the NCSL panel mentioned above, Carl Szabo of NetChoice took square aim at the AADC model. He said his wife, a child therapist, pointed out that in a clinical setting, they refer to things as “developmentally appropriate” rather than age-appropriate, given that children often mature at different rates, incommensurate with their age. He also opposed age verification and said that schemes to require it would force technology firms to collect even more information on users, creating a “treasure trove for data thieves.”
Szabo said NetChoice opposes these bills because they take away the rights of parents and put them into the hands of the state. He believes a better model bill was passed in Florida that provides resources to educate children on the safe use of technology. And he said that parents need to be more active in setting boundaries for minors. He used the phrase “take the phone away” three times and emphasized that parents need to model better behavior when it comes to using mobile devices.
In response, Georgia Sen. Sally Harrell (D-GA40) said, “I'm hearing a lot of ‘the parents need help. The parents need more education.’” “Carl, I can tell your children are young because you said you just took the phone away when something went wrong. By the time my kids hit middle school and high school, I could not take the phone away anymore because the teachers were making homework assignments on the phones. I didn't want my kids to be on Instagram. There was an English assignment where you had to create an Instagram account to complete the English class assignment.”
Likewise, Alaska Rep. Dan Saddler (R-AK24) called “many of the arguments” that Szabo offered “disingenuous” and said “this is a public health crisis.” “I applaud Utah's effort, and I look to try and copy it in Alaska,” said Rep. Saddler.
Szabo said he was confident that his perspective would win the day in court. “I would almost say look at the scoreboard because when we get our decisions against the States of California and Arkansas, it's not just going to be Carl Szabo hypothesizing. It's going to be me basing it on 200 years of law,” he said.
But Sen. McKell remains unfazed by legal challenges, suggesting that regulation of child online safety is inevitable in the long run.
“Look, we could lose a few lawsuits,” said Sen. McKell. “I don't think the State of Utah is worried about that. I think we're far more worried about our kids, and we are definitely going to put our kids before those lawsuits. We may lose some challenges. You may lose some challenges in your states. There were some tobacco cases that were lost early on, but in the end, you guys saw the scoreboard, and I hope you look towards that in your own states.”
What about the Federal government?
Sen. McKell urged other lawmakers to pass legislation addressing social media, but pointed to the possibility that different approaches may ultimately result in federal legislation. “Your bill doesn't need to look like Utah,” said Sen. McKell. “If you remember the fight with online sales tax, a lot of states passed a lot of different bills, and in the end, it forced the federal government to act.”
Action by the federal government is possible, if unlikely to be imminent. There are two federal bills to pay close attention to: the Kids Online Safety Act (KOSA) and an updated version of the Children’s Online Privacy Protection Act of 1998 (COPPA 2.0). KOSA, a bill sponsored by Sens. Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), was recently reported out of committee with overwhelming bipartisan support. COPPA 2.0 is sponsored by Sens. Ed Markey (D-MA) and Bill Cassidy (R-LA). It aims to update online data privacy rules for today's internet. More specifically, it would build on COPPA by raising the age below which personal data collection is prohibited from 13 to 16 years old, creating an “eraser button” for parents and kids to delete a child’s personal information, and establishing a Digital Marketing Bill of Rights for Teens.
Action at the federal level would indeed change the calculus for state legislators. But few expect much out of a divided Congress in an election year. For now, the action will remain in the states, and in the courts, with some probability that one or more disputes will end up before the Supreme Court.
This piece has been updated with details of the preliminary injunction in California.