From Dolls to Downloads: Courts Reimagine Product Liability for the Digital Age
Ariel Fox Johnson / Jun 12, 2025
Instead of clamoring for dolls and baseball mitts, kids today spend their allowance money on virtual bundles in apps. Their coloring books and board games are digital. Children and adults alike play and engage daily with digital products. Sometimes, they also get injured by them.
Courts are increasingly recognizing this reality — that many of the products youth and adults interact with are on screens or online. And courts are recognizing that manufacturers of such digital products have the same responsibilities to provide safe products to their users as manufacturers of physical goods, especially when those users are children. This legal perspective may offer a critically important way for families and consumers to protect themselves, especially if legislative avenues to obtain new privacy and safety protections shrink.
Under traditional products liability law, product manufacturers and sellers are responsible for the injuries their products cause to consumers. Product makers have a duty to provide safe products: if a bottle of soda explodes while you are putting it in the fridge and you get hurt, you can seek redress for your injuries. Product liability claims typically fall into three buckets: a manufacturing defect, a defective design, or a failure to appropriately warn users of the product’s risks. Manufacturers may be held strictly liable for their unsafe products, or they may be liable for their negligence.
Courts are increasingly allowing plaintiffs to pursue product liability claims against social media platforms, rideshare apps, and AI chatbots. In a recent high-profile case, Garcia v. Character Technologies, Inc., the court ruled that plaintiffs could pursue product claims against the Character AI app. In Garcia, a teenager took his life after becoming dependent on interacting with Character Technologies’ AI “characters.” The court determined that the deceased’s mother could move forward with product claims, such as failure to warn, against Character AI. It also held that Google could be liable as a component part manufacturer, as Google had supplied its Cloud technical infrastructure to help power Character Technologies’ LLM.
Other recent cases have also found that platforms or their features can be considered products, and that children (or their parents) can pursue product liability claims against the tech companies that manufacture and distribute such platforms. Lemmon v. Snap is an early example. In Lemmon, two boys died in a high-speed car crash while using Snapchat’s speed filter, which was widely understood to reward users for using it at speeds over 100 mph. The Ninth Circuit’s decision allowed the youths’ parents to bring a products liability claim against Snap for its speed filter under a theory of negligent design. The court noted that Snap was a “manufacturer” that had designed and made the product available, and that manufacturers have a “specific duty to refrain from designing a product that poses an unreasonable risk of injury or harm to consumers.” Snap may have violated its “duty to design a reasonably safe product,” and it could be held responsible for the platform’s “architecture.”
In a decision addressing hundreds of personal injury cases brought on behalf of children and teens against Facebook, Instagram, YouTube, TikTok, and Snapchat, a federal court in In Re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation considered at length whether products liability theories could be pursued against social media companies (under theories of design defects and failure to warn). The court noted that the governing Third Restatement of Torts defines “products” primarily as “tangible” items, but explicitly states that intangible items like electricity can be products “when the context of their distribution and use is sufficiently analogous to the distribution and use of tangible personal property.” The court also analyzed specific features of social media platforms (such as parental controls, barriers to deactivation, and filters) and asked “whether the functionality is analogizable to tangible personal property or more akin to ideas, content, and free expression.” The court found that plaintiffs could pursue claims against social media platforms as product makers for numerous specific product features.
From a public policy standpoint, this increasing recognition of digital playthings as products makes sense. (To be sure, judges still consider the First Amendment and Section 230 in these cases, and still dismiss claims on those bases. Also, most of these cases have not yet finally held that a social media or chatbot company is liable for harms from its defective product; rather, they have thus far simply allowed plaintiffs to pursue product liability claims.)
Historically, product liability evolved alongside the rise of mass manufacturing. One court wrestling with the definition of products in the digital era, in Neville v. Snap, noted parallels between that history and the modern experience. In Neville, a California court allowed plaintiffs (whose children had suffered fatal and near-fatal fentanyl overdoses from drug purchases made over Snapchat) to pursue claims that Snapchat was a defective product. The Neville court cited a 1940s case about an exploding bottle of Coca-Cola, in which a judge remarked that “the close relationship between the producer and consumer of a product has been altered… The consumer no longer has the means or skill enough to investigate for himself the soundness of a product.” The doctrine of strict product liability developed in the decades that followed, with the aim that injury costs be borne by the manufacturer rather than the “injured persons who are powerless to protect themselves.”
The Neville court explained how product liability evolved in the 20th century alongside the rise of industrialization:
Strict products liability arose in the early and mid 20th century from the perceived inability of the law of warranty and negligence to provide an adequate means of redress for injuries arising in a new kind of industrialized economy. That economy was characterized by mass production, the introduction of wholesale intermediaries in supply chains, and the expansion of product advertising. Makers and users of products had an increasingly attenuated relationship. Redress for injury arising from product defect… required rethinking. … Who should bear the costs of such harms: an innocent purchaser, or the product supplier who had the means and motivation to eliminate defects, and the ability to spread that cost widely via the mechanism of price and the securing of insurance?
It then astutely noted that the complaint at issue raised “similar questions about Snapchat and its place in society and the economy.” The digital revolution is no less disruptive for consumers than the industrial one. Individual users and tech behemoths have an incredibly attenuated relationship. Users are largely powerless to understand the harmful impacts of various technology products, let alone protect themselves or their kids from them.
Courts are not alone in using traditional business liability theories to hold tech companies to account. Efforts to hold companies accountable for the harms their products cause and for their negligence are also ongoing in state legislatures. In California, a bill is pending that would increase penalties for large social media companies if they are negligent and their platform features cause injury to minors. Of note, the bill does not create new liability; it merely increases financial penalties. Under California law, everyone, including technology companies, already owes a standard duty of care and can be liable when their negligence causes harm to another.
With the advent of the internet and the impending AI age, product liability may need to evolve again to adequately protect society and especially its most vulnerable members, such as children. As efforts to create new substantive protections rise and fall in the states and languish at the federal level, judicial recognition that companies can already be held accountable when they put out unsafe and dangerous products should give some hope to aggrieved and injured families.