With Congress Silent, the FTC Must Protect Kids from Big Tech
Haley Hinkle, Maurine Molak / Jun 3, 2025
Mark Zuckerberg, CEO of Meta Platforms Inc., center right, addresses the audience, including parents of children injured or lost in events involving social media, during a Senate Judiciary Committee hearing in Washington, DC, US, on Wednesday, Jan. 31, 2024. Photographer: Kent Nishimura/Bloomberg via Getty Images
On June 4, the Federal Trade Commission will host a workshop titled “The Attention Economy: How Big Tech Firms Exploit Children and Hurt Families.” One of us, Maurine, will be there as a panelist at a session called “Are Kids in Danger Online?”
The workshop — which will bring together child development experts, parents, and government leaders — will help underscore the FTC’s ability to protect kids from online harms at a time when its leadership on that issue is more important than ever. At the end of last year, the Kids Online Safety Act, which would have been the most important new law to protect kids online in over 25 years, failed to even get a vote in the House, despite the bill passing the Senate 91-3.
As our elected leaders continue to delay this essential legislation, more and more children are being exposed to serious and often deadly online harms. Thankfully, the FTC has the authority to step up in the absence of congressional leadership and take bold action to protect our kids.
The FTC Has The Authority To Address Unfair Design Choices
Specifically, the FTC can use its unfairness authority under Section 5 of the FTC Act to address design choices by online platforms that manipulate minors’ attention. Under the FTC’s Policy Statement on Unfairness, the basis for most cases in this space, a business’s act or practice is unfair if it meets three criteria: the practice results in substantial consumer injury, the injury is not outweighed by countervailing benefits to consumers or competition, and the injury cannot be reasonably avoided by consumers.
Social media and gaming platforms are rife with engagement-maximizing design features that we believe meet all three of these criteria. These features break down into a few broad buckets:
- Features that manipulate how a user navigates a product, including autoplay and strategically timed advertisements that make it difficult for a young user to exit a game or close a platform, and that eliminate the cues that would prompt them to log off and put down their devices.
- Variable reward design features, such as so-called loot boxes and other surprise video game rewards and endless scroll feeds that keep minors searching for the next funny, entertaining, or interesting piece of content.
- Social manipulation design features that exploit minors’ desire for relationships, including “like” and “follower” counts and interaction streaks that tacitly pressure minors to post and interact online more often, and with more people. Platforms also leverage one-sided, parasocial relationships by using fictional characters, popular influencers, or celebrities to keep kids engaged online. More recently, platforms have also begun to use AI-powered chatbots to hook young users, including bots that imitate familiar characters, deepening this trend.
Engagement-Maximizing Design Causes Substantial Harm to Minors
When it comes to the first part of the FTC’s unfairness test, we’ve seen firsthand how these engagement-maximizing design features cause substantial harm to minors’ mental and physical health. These features encourage excessive screen time, which in turn is associated with a range of negative health effects. Heavy digital media users are more likely to be unhappy or experience depression or suicidality. Engagement-maximizing design also encourages the kind of excessive screen time associated with problematic internet use, which is linked to psychiatric disorders and sleep disturbance.
Excessive screen time can also increase a minor’s risk of exposure to cyberbullying and dangerous interactions. The FTC acknowledged this harm — and the agency’s ability to address it — in its case against Epic Games for alleged violations of federal children’s privacy law, known as COPPA, and default settings that automatically put children into voice chats with adult players. The FTC found that Epic Games’ practices “exposed kids to dangerous and psychologically traumatizing issues, such as suicide and self-harm.”
Those risks continue to threaten kids on social media and gaming platforms — including those that feature AI companions — which time and again prove quick to engage children in age-inappropriate and dangerous conversations about sexual topics and suicide.
These Practices Offer No Benefits To Consumers Or Competition
Sadly, Big Tech companies know about these harms, but appear unwilling to adequately change their business practices. Lawsuits filed by state attorneys general against Meta and TikTok have revealed the extent to which tech executives are willing to look past clear evidence of negative effects on kids and teens if it helps their companies’ bottom line. Families cannot influence these decisions on their own.
This brings us to the second prong of the FTC’s unfairness test: These practices do not provide any countervailing benefit to consumers — the kids, teens, and families impacted by these business decisions — or competition. Instead, Big Tech is engaged in a race to the bottom, where the companies that can most effectively prey on young users’ vulnerabilities stand to gain the most.
Minors Cannot Reasonably Avoid These Harms
For the final part of the FTC’s test — the requirement that the injury suffered could not have been reasonably avoided by consumers — we can look to the experience of Maurine’s late son, David, who died by suicide at age 16.
Engagement-maximizing features on both gaming and social media platforms pulled David in at a time when he was particularly vulnerable to them. An avid basketball player, David fractured his back from overuse and had to take time off to rehabilitate. He turned to video games and social media to fill the void during his recovery, and over the next nine months, he developed a digital addiction.
David became obsessed with purchasing add-ons to increase his player power, and he spent hours on social media learning tips, tricks, and tactics from professional players so he could improve his gameplay. David’s behavior completely changed during this period. His struggles also led to cyberbullying from his classmates — a harm that was not alleviated by changing schools, because his digital footprint followed him wherever he went.
David’s family could see that something was wrong and tried everything to help him. But his story exemplifies the stark reality of engagement-maximizing design: kids, teens, and their families are not capable of avoiding these harms on their own. Minors are still developing their executive function skills, which help direct attention and behavior and are associated with prioritizing tasks, filtering distractions, and setting goals — skills that are critical for navigating online environments.
Adolescence is also marked by heightened reward- and sensation-seeking behaviors, which lead teens to seek out experiences motivated by reward stimuli. Design features like autoplay, endless scroll, and variable rewards prey upon these cognitive vulnerabilities, overcoming kids’ and teens’ limited ability to control impulses. Neither parents nor their children are any match for the carefully tested design choices Big Tech deploys to influence user behavior.
Why The FTC Must Act Now
Big Tech companies have made their positions clear: Left unchecked, they will try to seize an ever-growing share of our children’s time. But the FTC has the authority under the FTC Act to send a clear message to the market that it will not tolerate design that maximizes screen time and puts children’s well-being at risk for the sake of profit.
The FTC’s unanimous decision in its case against Epic Games was an important first step, but the agency’s Policy Statement on Unfairness empowers the commission to address a much broader range of manipulative, engagement-maximizing design practices. We urge the FTC to investigate and regulate navigation manipulation, variable rewards, and social manipulation design features for what they are: unfair design practices that benefit Big Tech while our families bear the cost.
Children in this country are facing a crisis fueled by online platforms’ exploitative business model. While Congress continues to dither, the FTC has the chance to demonstrate the leadership that our elected representatives have failed to show. The FTC should use its full authority under the law to rein in tech companies’ manipulative design choices and prevent more young lives from being lost to Silicon Valley’s unfair practices.