The Drive For Age Assurance Is Turning App Stores Into Childhood Regulators
Chris Stokel-Walker / Jan 12, 2026
The question of who is responsible for protecting children online has flared up again, this time because of Grok. Concerns over the AI chatbot generating non-consensual images have thrown a harsh spotlight on the role app stores play in enabling the mass distribution of apps and services that can be harmful to children. Some have even argued that the recent scandal over the use of AI to create child sexual abuse material (CSAM) should result in the app being removed from Google and Apple’s stores.
Until now, the question of who governs childhood online has largely been a platform-level argument. It has historically depended on what Instagram permits, how TikTok moderates, or whether an AI companion app enforces age limits at all.
But as governments move more aggressively to address online harms, most visibly in Australia’s decision to bar under-16s from social media, a shift is underway. Rather than attempting to police every app directly, lawmakers and parent groups are increasingly targeting the infrastructure that sits above them. App stores, operating systems, and digital identity infrastructure are being recast as chokepoints for child safety enforcement. That shift is fueled by demands from parents, 88% of whom believe app stores should seek parental approval before their kids can download a new app.
“I think you're seeing a growing tech temperance movement aimed at social media, tech, and AI,” says Adam Kovacevich, founder and CEO of the Chamber of Progress, a tech industry coalition. “It has its roots in Jonathan Haidt's writing and it's fostered by a worry about creeping unfettered technology into family and personal lives.”
For campaign groups like the Digital Childhood Alliance, the focus on app stores is an obvious one. “App stores are the digital gatekeepers of our children’s lives,” Casey Stefanski, executive director of the alliance, said in a statement. “They control what gets through, but until now, they’ve had zero accountability.” That accountability gap has been exploited by Grok to the detriment of younger users.
Others, however, argue that the move to put the onus on app stores is less about them being best placed to act and more about measures elsewhere faltering. Whether app stores are equipped — or incentivized — to play that safety role is far less clear.
Everyone agrees there’s a problem — but not on how to fix it
Everyone wants the benefits of a safer internet for kids, but nobody wants the liability, reputational risk, or the user friction that comes with hard checks. “What I’ve seen over the past few years is a kind of passing of the buck,” says Sonia Livingstone, a professor studying children, media and the internet at the London School of Economics. “Platforms have been trying to get age assurance to be done by device manufacturers, and they've been pushing it elsewhere.”
The reason is twofold, says Livingstone. “One is about liability,” she explains. “Platforms don’t want to be the ones who are legally liable. The other is about market positioning.” Platforms don’t want any kind of friction for their users when they’re signing up. “To everyone it looks like a cost,” she says. “To everyone it looks like a possible kind of reputation damage and actual liability.”
“It’s very much passing the buck, because the standardization is so ridiculous,” agrees Michael Veale, professor of technology law at University College London. Putting the burden of age verification on every single app, he says, can result in a hodge-podge of different options with varied functionalities. Age verification at that scale would “massively hinder innovation on the internet, unless you can reduce the cost of these tools.” Nevertheless, governments worldwide seem insistent on standardization. Proposals such as the App Store Accountability Act model — and related ideas in the broader “UnAnxious Generation” policy package floated by Representative Jake Auchincloss, a Massachusetts Democrat — share a common goal: pushing responsibility up the stack so that Apple and Google shoulder more of the burden.
Auchincloss’s App Store Accountability Act, introduced in May 2025 with Indiana Republican Erin Houchin, would allow parents to provide their child’s age to app stores when setting up a device, handing responsibility for blocking access to age-inappropriate apps to the app store owners. It would also allow parents or guardians to link their accounts and devices with their children’s to monitor what apps they use, and would require parental consent for anyone under 18 to access app stores.
The thinking behind this approach is that multiple, app-level age checks are easy to subvert. Focusing instead on the point of distribution, the app store, creates a single point of enforcement. Those putting forward the legislative proposals argue that app stores already audit content extensively: deciding which apps can exist, which payment systems are permitted, which content categories are acceptable, and how much visibility services receive. Age checks would simply become another item on that list.
App stores were never built to be safety regulators
“App stores have long taken a role as informal or private governance regulators,” says Veale, noting that this role is “sometimes mandated by law.” But app store checks, he cautions, may seem more thorough than they are, and they might not amount to the same thing as an age or safety check for users.
“As much as app reviewers want to help, their pay cheque depends on getting apps in the door and keeping developers happy,” says Veale, pointing to figures showing that Apple's app review process gives reviewers about 13 minutes per app to establish whether they’re safe or not. Others are even quicker: disclosure in a 2023 court case revealed that apps on the Samsung Galaxy Store go through checks by Vietnamese staff, who average six minutes per check. “They usually don’t even open the app,” says Veale. Livingstone alleges that the app stores have been “quite extraordinarily lax” in their checks, which makes her surprised that they’re being called upon in this instance.
Policymakers want app stores to act like safety regulators, but the mechanisms they rely on were designed for scale, speed and developer relations, and not for evaluating how apps affect child development or cause harm. That does not mean app stores could not adapt, Veale notes, but doing so would require a fundamental shift in how they operate.
Piecemeal approaches aimed at the wrong problem?
The United States’ bill-by-bill approach, which places age adjudication in the hands of app stores, contrasts with the European Union’s model. The EU has recently backed a non-binding minimum age of 16 for social media and AI companions, tied to an EU age verification app and the EUDI (EU Digital Identity) wallet. Coupled with queries from the European Commission to Snapchat, YouTube, Apple and Google about how they operate their age verification systems and prevent minors from accessing content they shouldn’t, it suggests Europe doesn’t place that much trust in platforms — nor in app store providers.
The European approach takes the decision away from app stores and brings it closer to governments, which can verify ages authoritatively — though it comes with its own questions about privacy and safety.
Whatever approach is taken, the central question remains whether asking app stores to step in is the right solution at all. Kovacevich is skeptical of age-based bans driven by political momentum following Haidt’s work. “We only need to look at the failure of Prohibition in the 1920s to know that bans don't work,” he says, pointing to widespread evasion and dishonesty around age online under COPPA.
He also opposes age verification on privacy grounds. “Age verification mandates typically require invading the privacy of both minors and adults,” he argues. Policymakers, he suggests, would be better served by promoting healthier relationships with technology, an approach that demands more effort than shifting blame up the stack to app stores, after other actors have avoided responsibility.