Let’s Make Privacy Easy
Tom Kemp / Aug 21, 2023
Tom Kemp is a Silicon Valley-based entrepreneur, investor, and policy advisor. He is the author of Containing Big Tech: How to Protect Our Civil Rights, Economy, and Democracy.
When I was CEO of a mid-size technology company coming out with a new consumer-focused service offering, we spent a fair amount of time doing A/B user interface testing before the product's release. I had occasional meetings with individual engineering groups, but with our User Experience (UX) team, I emailed and met regularly. I knew, like most of my CEO peers in Silicon Valley, that ease of use and the quality of the UX make or break customer adoption.
Big Tech firms have always set the bar in the tech industry regarding their UX and making their products easy to use. Steve Jobs was famously quoted as saying, “You've got to start with the customer experience and work backward to the technology. You can't start with the technology and try to figure out where you’re going to try to sell it.” For Big Tech, it is essential to not only have a great UX to get consumers to initially sign up but also utilize persuasive technologies to keep consumers on their platforms. This helps these firms build a network effect and enable more user behavioral data to be mined and more ads to be served. For example, Meta has historically experimented with the colors of buttons and the frequency of notifications to get users to return to its platform.
Artificial Intelligence (AI) is now key for Big Tech when it comes to UX and maximizing engagement. Big Tech firms have massive AI R&D investments, and have the further advantage of the large computing infrastructures and vast amounts of data that AI needs to be effective. In the case of TikTok, it captures dozens of “microsignals” from users per video, such as when a user likes a video or skips to the next one. These signals are immediately fed into TikTok’s AI systems, helping them optimize and personalize future videos. The result is that users are captivated for as long as possible, to the point where some say TikTok knows them better than they know themselves. And Meta is using AI to predict who would click on an ad and determine who would like or share a post, with the goal of delivering more personalized ads and content. All of this, of course, is enabled through the continued mining of behavioral and personal data.
Sensing consumer concerns regarding the wide-scale collection and proliferation of personal data, and growing worries over AI and its ability to do automated decision-making leveraging that data, lawmakers have responded by enacting privacy laws in recent years. In the US, it started in California in 2018 with the California Consumer Privacy Act (CCPA). Twelve more states have passed privacy laws as of mid-2023. But after California passed its pro-consumer privacy law, the tech industry took notice and has been significantly putting its thumb on the scale when it comes to new state privacy laws, including literally writing some states’ privacy laws.
So yes, consumers in twelve states have or will soon have privacy rights (e.g., right to know, delete, correct, etc.). But the vast majority of those laws are crafted so it is neither easy nor intuitive to exercise those rights. Various industry groups have carved out exemption after exemption of what data can continue to be collected. Further exacerbating the situation is that US privacy laws are all “opt-out,” meaning the onus is on the consumer to explicitly say no to the collection or selling of their data by every business they interact with. (Side note: the “opt-out” path was taken in California because lawmakers were worried that an “opt-in” consent framework — with the default being not to collect or sell personal data unless the consumer consents to it — would be challenged on First Amendment grounds due to the 2011 Sorrell v. IMS Health Supreme Court decision involving a similar matter.)
So, it is not surprising that when you look at the required reporting that large tech firms need to publish regarding Californians' data privacy requests, the numbers are low. Consumers are not taking advantage of the rights they have been given. For example, in 2022, only 5,000 Californians requested that Meta delete their data, and only 16,000 requested to know what Meta was collecting about them. Even in Europe, with its opt-in framework, consumers face a barrage of cookie banners that has led to fatigue and to many abandoning the selection of their preferred privacy settings altogether.
This tells us that exercising privacy rights is too hard for consumers. But let’s be clear; consumers do want privacy protection. For example, in 2020, over 9.3 million Californians voted “Yes” on Proposition 24, the California Privacy Rights Act (CPRA), which upgraded the CCPA. (Full disclosure: I was a full-time volunteer on the campaign to get the CPRA passed.) That is more votes than the entire population of each of 10 US states. To put it another way, Prop 24 got more votes in California than presidential candidates Barack Obama or Hillary Clinton got in the state in prior elections. And it’s not just at the ballot box where people express their desire for privacy: when allowed to block third-party tracking, 96% of Apple users turned on App Tracking Transparency.
So, we know that consumers want privacy, and when made simple to enable, they will turn it on. Thus, to unlock the value of consumer privacy rights at the state level and with any future federal privacy law, we need to do what Steve Jobs suggested: start with the customer experience and work backward. Here are three suggestions to make privacy easy for consumers.
Opt-out preference signals
The first step is that state laws and any future federal law must have support for “opt-out preference signals.” This addresses pop-up box and cookie fatigue because a consumer simply needs to turn this signal on at the browser or device level, and then every subsequent website visited would require that the website owner treat the signal as a valid request to opt out of the sale and sharing of their data. This means the consumer sets it up once and does not have to make individualized requests with each business. This gives consumers the proverbial “one and done” in less than 30 seconds.
The good news is that this concept is incorporated in the Global Privacy Control (GPC), which is supported in several browsers and plug-ins, and states like California, Colorado, and Connecticut require support for the opt-out preference signal (e.g., Colorado calls this a “universal opt-out mechanism”). The bad news is that major browsers such as Google Chrome and Apple Safari do not support GPC out of the box, and since it is not a feature of Google Android or Apple iOS, it has not been implemented for mobile apps. The tech industry hates this approach and made sure in the watered-down state laws that GPC / opt-out signals are not a requirement. So, we need to push policymakers to make privacy easier for consumers and add this to privacy laws. And we need to remind vendors such as Apple that if privacy is a high priority, then support for GPC at the browser and device level is required.
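To make the mechanism concrete: under the GPC specification, a participating browser attaches an HTTP header, `Sec-GPC: 1`, to every request (and exposes a `navigator.globalPrivacyControl` property to scripts), so a website can honor the opt-out without any per-site dialog. Here is a minimal sketch of the server side; the function name and the plain header dictionary are illustrative, not from any particular web framework.

```python
def honors_gpc_opt_out(headers: dict) -> bool:
    """Return True if a request carries the GPC opt-out signal.

    Per the Global Privacy Control spec, the browser sends the header
    `Sec-GPC: 1`. HTTP header names match case-insensitively, and only
    the exact value "1" asserts the signal; anything else is no signal.
    A business receiving True would treat the request as a valid opt-out
    of the sale and sharing of that user's personal data.
    """
    for name, value in headers.items():
        if name.lower() == "sec-gpc":
            return value.strip() == "1"
    return False

# A browser with GPC enabled sends the signal on every request:
print(honors_gpc_opt_out({"Sec-GPC": "1"}))          # True
# A browser without the setting simply omits the header:
print(honors_gpc_opt_out({"User-Agent": "Mozilla"}))  # False
```

This is the “one and done” property in code: the user flips one browser setting, and every site they visit receives the same unambiguous signal on every request, with no per-business opt-out forms.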
Hit the delete button with data brokers
The above suggestion addresses companies with which consumers directly interact. But what about companies known as data brokers that we don’t have a direct relationship with? These businesses collect gobs of personal data on every internet user and then sell it to just about anyone with a credit card. We probably want that data deleted, given the increasing weaponization of data being sold by data brokers. But consumers need to figure out whom to contact to request deletion, and even if they looked at online data broker registries maintained by states such as California and Vermont, who has time to reach 500 data brokers and ask every one of them to delete their data?
My second suggestion to make privacy easier is to have the equivalent of the Federal Trade Commission's Do Not Call Registry for data brokers. Proposals in this vein include the proposed federal DELETE Act (put forth by Senators Bill Cassidy, R-LA, and Jon Ossoff, D-GA) and the proposed California Delete Act (Senate Bill 362). [Full disclosure: I co-drafted SB 362]. These bills would create an online portal for consumers to request that data brokers delete any data they have on the consumer and no longer track them. This, too, would give consumers the “one and done” in less than 30 seconds, versus spending hundreds of hours contacting data brokers and requesting they delete our data.
Ban dark patterns
Finally, we need to ban dark patterns, especially those that block our ability to exercise our privacy rights in a frictionless manner. The CPRA regulations describe what constitutes a dark pattern when it comes to substantially subverting or impairing user autonomy, decision-making, or choice in the exercise of consumer privacy rights. This can be the model for other privacy laws and regulations.
- - -
In summary, so much time and effort has been spent passing comprehensive privacy legislation in twelve states, with more work to be done in the other states and hopefully at the federal level. But what good is giving consumers privacy rights if they can’t easily exercise them? Policymakers need to take a page from Silicon Valley and think of privacy UX from the consumer’s vantage point. Let’s stop making privacy hard for people. The tech industry won’t particularly like that, as it will partially cut off the ingestion of data that fuels their business models. But the fact that 96% of Apple users are turning on App Tracking Transparency and 9.3 million voters in California voted for the CPRA shows that consumers overwhelmingly want control over their privacy, and policymakers and regulators should make fulfilling that need just as easy as installing an app or liking a social media post.