The Future of End-to-End Encryption May Get Decided This Week in Nevada

Riana Pfefferkorn / Mar 12, 2024

The offices of Nevada Attorney General Aaron D. Ford.

There’s a serious threat to end-to-end encryption underway right now in Nevada. At a time when other regulators have recognized how vital strong encryption is for protecting communications and other personal data, the Nevada Attorney General has decided to go in the opposite direction: he’s gone to court to try to force Meta to roll back, for every child in the state, its long-awaited switch to using end-to-end encryption by default in Messenger. Unsurprisingly, Nevada is framing its demand in the interest of child safety. However, its request would make the state’s children more vulnerable online, not less. And if the state gets its way, what happens in Vegas won’t stay in Vegas.

The promise of encryption

End-to-end encryption (E2EE) is a means of scrambling information so that only the sender and intended recipient (the “ends” of the communication) can unscramble it. No third party can read the data, not even the entity that encrypted it. Thus, an E2EE app provider can’t read its users’ messages nor give anyone else access to them, whether intentionally (to comply with a government demand, say) or unintentionally (in a data breach, for example).
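For the technically curious, the core idea can be sketched with a toy key exchange in Python. This is purely illustrative, with insecure parameters invented for readability; real E2EE apps, including Messenger, use vetted protocols such as the Signal Protocol. The point it demonstrates is the one above: the two ends derive a shared key from values they each keep private, so a server that relays only the public values (and the ciphertext) can never recover the message.

```python
import hashlib

# Toy Diffie-Hellman key exchange. The prime, generator, and secrets
# below are illustrative stand-ins, far too small for real security.
P = 0xFFFFFFFB  # public prime modulus
G = 5           # public generator

alice_secret = 123456  # known only to Alice's device
bob_secret = 654321    # known only to Bob's device

# Each end sends only its *public* value through the server.
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Both ends derive the same shared key. The relaying server sees only
# alice_public and bob_public, which are not enough to compute it.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
assert alice_key == bob_key

def xor_cipher(message: bytes, key: int) -> bytes:
    """Toy symmetric cipher: XOR against a hash-derived keystream.

    XOR is its own inverse, so the same function both encrypts and
    decrypts when called with the same key.
    """
    stream = hashlib.sha256(str(key).encode()).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(message))

ciphertext = xor_cipher(b"meet at noon", alice_key)  # what the server relays
plaintext = xor_cipher(ciphertext, bob_key)          # only Bob can reverse it
assert plaintext == b"meet at noon"
```

Because the provider never holds the shared key, it has nothing to hand over in response to a demand and nothing to leak in a breach, which is exactly the property described above.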

Many popular messaging apps, such as Apple’s iMessage, WhatsApp (also owned by Meta), and Signal, are E2EE by default. Defaults matter a lot, because most users never change default settings. When E2EE is the default, users’ conversations are protected without any action on their part. On Messenger, E2EE has been an option since 2016, but users had to remember to turn it on for each conversation. Five years ago, though, Meta announced that E2EE would become the default on Messenger. And this past December, after years of work, the company finally delivered on its promise.

This is great news for Messenger’s billion-plus users. By keeping out prying eyes, E2EE is an enormous boon for users’ privacy, security, and freedom of expression. True, providers can’t proactively monitor users’ conversations for abuse, plus it’s harder (but by no means impossible) for law enforcement to obtain messages as evidence. That said, the vast majority of the tens of billions of messages exchanged daily on E2EE apps are innocuous. On balance, E2EE’s benefits against hackers, cybercriminals, domestic abusers, foreign governments, and other unwanted snoops far outweigh the downsides caused by a tiny malicious minority.

Nevada’s Attorney General wades in

That’s apparently not the view of the Nevada Attorney General, though. In late January, he filed a complaint alleging that Meta designed Messenger to be addictive to children, letting the company monetize children’s data and attention while harming their mental and physical health, privacy, and so on. The complaint’s allegations largely echo the federal lawsuit filed last fall against Meta by dozens of other state attorneys general. But unlike that case, the Nevada AG also goes after end-to-end encryption, calling it a “confounding safety threat” for children. It’s a seemingly minor claim, just 11 heavily redacted paragraphs in a 528-paragraph complaint, and it’s explicitly unrelated to the addiction allegations. Rather, the unredacted slivers suggest that it’s about sexual predators’ ability to talk to children without Meta listening in.

Since the case is mostly about social media addiction – and since Meta’s default E2EE plans had been well-publicized since 2019, long before their completion in December – it came as a surprise when, in late February, Nevada filed a motion urgently asking the court to force Meta to immediately stop providing default E2EE on Messenger to all under-18 users in Nevada. Why? Because that’s allegedly a violation of Nevada’s consumer protection law, which prohibits deceptive or unconscionable trade practices. Meta had just one day to file a response. Following a flurry of activity and a hearing, the court set a slightly less urgent schedule. Meta filed a longer opposition on March 7. The state is scheduled to file a revised reply on March 14, and a hearing is set for March 20.

Yesterday, I joined a group of digital civil society organizations and privacy-conscious tech companies in filing a friend-of-the-court brief to explain that the Nevada AG’s utterly baffling demand has it exactly backwards. Default E2EE benefits Messenger users, children included. While everyone deserves strong privacy and cybersecurity protections online, that’s particularly true for children. Children, of all people, should have default E2EE turned on, rather than being made to take action to protect themselves (much less to understand what encryption is). Yet the Nevada AG seeks to compel the opposite outcome. If he gets his way, perversely, Nevada’s children would have worse privacy and security protections on Messenger than everyone else, leaving them more vulnerable to bad actors.

The Attorney General’s move would make children less safe

Providing E2EE to all users is not a deceptive or unconscionable trade practice. To the contrary, failing to provide strong privacy and cybersecurity protections for users’ data has long drawn the ire of consumer protection watchdogs such as the Federal Trade Commission (FTC). In addition to repeatedly sanctioning Meta over its privacy and data security practices, the FTC has penalized other companies for unfairly poor data encryption practices and for deceptively misrepresenting the strength of their data encryption. Nor are children’s data privacy and security exempt from those expectations, whatever the Nevada AG suggests: just last year, the FTC issued a consent order to an ed-tech company for “lax” practices that resulted in data breaches revealing children’s dates of birth, sexual orientation, and disabilities.

It’s disappointing to see the Nevada AG promote the tired old myth that E2EE leaves app providers, and thus law enforcement, totally hamstrung in combating abuse. I debunked this myth in a peer-reviewed 2022 article based on a survey of online service providers, including Meta. My study found that “content-oblivious” trust & safety techniques (particularly metadata analysis and user reporting), which don’t rely on at-will provider access to the contents of users’ communications – and thus, crucially, are not impeded by E2EE – are considered more useful for detecting most types of abuse than automatically scanning the contents of users’ conversations (which is no longer possible in an E2EE environment).

That includes types of abuse specified in the Nevada AG’s complaint, such as harassment and self-harm content. Importantly, for detecting attempts to groom or entice children – i.e., the exact reason for the Nevada AG’s demand – user reporting is deemed just as useful as scanning content. And users can still report abusive messages even now that Messenger is E2EE by default. The Nevada AG has yet to explain why investigators couldn’t obtain user-reported messages from Meta with a warrant.

In addition to enabling user reporting, Meta gathers vast amounts of information about its users that it can analyze to detect bad actors on its products, even without access to everyone’s messages. Our brief calls this out as an ongoing shortcoming in Meta’s privacy practices. Metadata about our conversations and accounts can be extremely revealing: it can show where we go, who we know, whom we talk to (and when, how often, and for how long), what our interests are, etc. That information is accessible to the government with proper legal authority. Indeed, Meta complies with tens of thousands of requests for user data from US law enforcement every six months, and the numbers keep going up. While this is disturbing from a privacy standpoint, it nevertheless goes to show how hollow the Nevada AG’s claims are that E2EE stymies investigations.

And it’s not just Meta that serves as a source of evidence. Even if E2EE messages can’t be plucked off the wire or seized from Meta in legible form, law enforcement can often get them from the users’ devices directly, either by the user’s consent to the search or by using forensic tools to pull data from a phone (sometimes even if it is locked). Public records requests revealed that as of 2020, the Las Vegas Metropolitan Police Department had spent over $646,000 on mobile device forensic tools from companies such as Cellebrite. That fact somehow didn’t make it into the Nevada AG’s court filings.

Put another way, there’s simply no need for the AG’s demand to stop protecting Nevadan children’s Messenger messages with default end-to-end encryption. E2EE doesn’t stop user reporting, an effective means of detecting abuse on a service. And in this golden age of surveillance, there’s still a plethora of user information collected by Meta that isn’t E2EE (and probably never will be). All of this was well-known long before the Nevada AG’s bizarrely belated demand over a product feature that was announced five years before it actually launched. In fact, part of the reason Messenger’s default E2EE rollout took this long was precisely so that Meta could address user safety concerns. The Nevada AG should not be heard to complain now.

True, in losing the ready access to unencrypted message contents to which they’d become accustomed, investigators in Nevada may have to do more work than they would if Messenger hadn’t gone default E2EE. But law enforcement isn’t guaranteed that its work will be simple and easy, or that evidence will be available to it at all. A free society requires some friction in law enforcement investigations – whether from technical measures such as encryption, or from legal protections such as the Fourth and Fifth Amendments and laws protecting privacy. Federal law expressly permits Meta to encrypt users’ messages. Users are free to turn on disappearing messages, decline to back up their chats, or talk to someone face-to-face instead of over an app. The government has never, in the history of this country, been entitled to expect that everybody would create and store, in readily accessible form, a permanent record of everything they ever say to one another, just in case the government wanted to review it later to look for possible crimes. Nevada’s children should not be conditioned to expect otherwise. Especially not by an elected official claiming he’s doing it to protect them.

This mistake could metastasize

If the court grants the Nevada AG’s latter-day request after this month’s hearing, the resulting injunction won’t just affect Nevada’s children. Anyone (adult or child) who talks to them, or is mistakenly identified by Meta as being one of them, will no longer get default E2EE on Messenger either. Plus, a successful request in Nevada might inspire copycat demands elsewhere. That multi-state social media addiction lawsuit against Meta that I mentioned above? It has 42 state AGs as plaintiffs. A copycat injunction for Messenger would mean no more default E2EE for most of the country’s children (and, as noted, a significant number of adults).

Hopefully those other state AGs would pick a wiser course than this one rogue state AG has chosen. Consumer protection regulators have spent years telling Meta to do better at protecting user privacy. Making Messenger E2EE by default is the best thing Meta has done in that regard in a long time. The Nevada AG’s own complaint against Meta says that “[i]n the digital privacy ecosystem, this is a move that might be lauded.” Yet rather than laud it, the Nevada AG is trying to undo it. He would rather force Meta to give the state’s youngest users worse digital privacy and security than everyone else. That isn’t promoting child safety online; it’s undermining it. Even more astonishing, he’s trying to rebrand default E2EE as an unconscionable and deceptive trade practice. Strong encryption isn’t a violation of consumer protection; it’s a vindication of it.

The Nevada AG’s request is so wildly contrary to well-established best practices and long-standing interpretations of consumer protection law that it would almost be funny if it weren’t so dangerous. We can only hope the judge in Nevada laughs him out of court. The children of Nevada deserve better than this.


Riana Pfefferkorn
Riana Pfefferkorn is a Research Scholar at the Stanford Internet Observatory. She investigates the U.S. and other governments' policies and practices for forcing decryption and/or influencing the security design of online platforms and services, devices, and products, both via technical means and th...