Can the American Privacy Rights Act Accomplish Data Minimization?

Joseph Jerome / Apr 11, 2024

Joseph Jerome is Visiting Assistant Professor at the University of Tampa and a Tech & Public Policy Visiting Fellow at Georgetown's McCourt School of Public Policy.

The announcement of the bipartisan and bicameral American Privacy Rights Act (APRA) begins the latest episode in the United States’ interminable quest to enact a national privacy standard. This saga has been going on for decades at this point, but APRA is an important proposal for a couple of reasons. First, as I will discuss at length, APRA, like the American Data Protection and Privacy Act (ADPPA) before it, is noteworthy for embracing a theory of data minimization and rejecting the failed “notice and choice” framework. This is a big deal, but it arrives in a legislative proposal that is trying to do too much in too fraught a political environment.

Data Minimization

Gizmodo’s recent overview of APRA relegates the introduction of data minimization requirements to almost a footnote. Historically, comprehensive federal privacy legislation has been stuck on two questions: how far it should preempt state laws and other federal statutes, and how it should be enforced. Most press releases, public statements, and summaries of APRA zero in on these two topics.

Yet APRA’s shift toward a remarkably strict data minimization standard should not be underappreciated. Instead of operating under the fiction that consumers can meaningfully agree to elaborate privacy policies, APRA begins with the presumption that businesses should not process data “beyond what is necessary, proportionate, and limited to provide or maintain a specific product or service.”

This is not a novel idea. There are longstanding privacy principles that stress minimizing collection of data and putting forward clear and specific purposes for its use, but legislation of this kind is a sea change from privacy proposals put forward during the Trump and Obama administrations. A decade ago, the prevailing attitude was that privacy protections should focus on curtailing data misuse rather than attempting to limit the mere collection of personal information. However, APRA recognizes that the cavalier collection of personal information, by itself, creates a digital ecosystem prone to mischief. Car infotainment systems snitch on their drivers, various parties are trying to figure out who’s pregnant when, and adtech can be used to surveil world leaders. While companies often resist embracing data minimization, the concept is winning the day in the realm of law and policy.

The data minimization provisions are the most consequential part of APRA. Language prohibiting processing “beyond what is necessary, proportionate, and limited” would seem to be stricter than ADPPA’s directive that data processing be “limited to what is reasonably necessary and proportionate.” Even though the words are similar, the legislative drafting suggests companies have less discretion to determine what may be reasonable. This could be good or bad depending on how one views the need for businesses to have flexibility, but it could stop the endless cycle of debates about whether Facebook’s latest privacy policy update is some sort of privacy “gotcha.”

Flexibility via “Permitted Purposes”

APRA does try to provide companies with some flexibility, however. While the bill starts with a presumption against expansive data processing, that isn’t the end of the story. The bill delineates an additional fifteen “permitted purposes” for data processing, and depending upon how one reads the proposal, there could be other data processing activities that might be allowed.

These main exceptions are worth listing out in full and include:

  1. Protecting data security;
  2. Complying with legal obligations;
  3. Making legal claims;
  4. Transfers to law enforcement pursuant to a warrant, administrative subpoena, or other lawful process;
  5. Effectuating a product recall or fulfilling a warranty;
  6. Conducting market research (which requires affirmative express consent for consumer participation);
  7. With respect to data already lawfully collected under APRA, de-identifying data for use in product improvement and research;
  8. Asset transfers in mergers and acquisitions;
  9. Telecom and mobile carriers providing call location information for emergency services;
  10. Preventing fraud and harassment (though not for selling to government agencies, including law enforcement);
  11. Responding to an ongoing or imminent network security or physical security incident;
  12. Responding to ongoing or imminent security incidents or public safety incidents (though not for selling to government agencies, including law enforcement);
  13. Responding to criminal activity (though not for selling to government agencies, including law enforcement, and not health information);
  14. Processing non-sensitive data for first-party or contextual advertising; and
  15. Processing non-sensitive data for targeted advertising.

This list is several items shorter than the seventeen permissible purposes in ADPPA, but it still creates a lot of potential exceptions to the blanket rule. To be clear, some of these are common exceptions and necessary for basic business activities.

Still, other permissible purposes are novel, or potentially broad and problematic. APRA’s drafters seem to recognize this, as well. For instance, permissible purpose 13 would seem to recognize the heightened sensitivity of health data post-Dobbs and explicitly excludes using “health information” to respond to criminal activity. Other permitted purposes attempt to exclude sharing with government entities entirely.

APRA also includes market research and de-identifying data as permissible purposes. Elements of these provisions also appeared in ADPPA, and this seems to be one way that APRA attempts to permit companies to use data creatively internally. In other words, these are the permitted purposes that can be put forward to argue APRA is not “hindering innovation,” though I cannot imagine this will satisfy critics. For instance, the “de-identified” permissible purpose is limited to data that was previously collected under APRA. This could prevent the purpose from becoming the exception that swallows the rule, but it could also be argued that it does not allow enough data to be de-identified to be useful for internal or research purposes. Further, APRA sets up a collision between those who would have federal privacy rules set a baseline for regulating artificial intelligence and companies that are already arguing that AI advances warrant redefining data processing and personal data under existing privacy regimes.

Enter the Federal Trade Commission

I raise these issues not to suggest any particular exception is problematic, but rather that much more discussion is needed to understand what data processing could be allowed. With many of these exceptions, the devil will be in the details.

Solving this appears to be a job for the Federal Trade Commission (FTC). The FTC is tasked with providing guidance on what is reasonably necessary and proportionate to comply with the bill’s data minimization requirements. (Curiously, this provision uses the “reasonably necessary and proportionate” formulation that exists in ADPPA instead of the stricter data minimization standard in APRA.) The FTC has ample context to pull from in the draft text, but this remains an unenviable responsibility.

I understand the appeal of letting the FTC elaborate upon a set of statutory permitted purposes – and potentially expanding upon them. While at the Center for Democracy & Technology (CDT), I worked on a federal privacy proposal based on prohibiting “unfair data practices,” which is the inverse of what APRA and ADPPA propose. Whereas that effort allowed the FTC to stop certain business activities, APRA makes no affordance for future uses of data. FTC guidance is not the same as explicit rule-making authority, and it is not clear how new permissible purposes could be established except by congressional action. Regardless, this runs the risk of putting the FTC – still an under-resourced regulator – in the awkward position of shaping how companies can use personal information in innovative ways.

Is ‘Legitimate Interests’ a Better Approach?

No discussion of APRA or ADPPA can be complete without some sort of comparison to the EU’s General Data Protection Regulation (GDPR). Data minimization is an important principle of the GDPR, which is often put forward as the global standard for regulating data. Under the GDPR, any processing of personal data must flow from a legal basis, and while this frequently gets conflated with a blanket consent requirement, companies are also legally permitted to process personal data where they have a “legitimate interest” to do so. Many of APRA’s permissible purposes, including ensuring information security or preventing fraud, are well within a company’s legitimate interests.

But the GDPR concept of legitimate interests does not map onto APRA. A list of purposes is not the same as a balancing test. When not busy lobbying against privacy reforms and arguing privacy rights help child predators, data brokers have also highlighted legitimate interests as a GDPR provision that “accommodate[s] innovation.” RELX’s former chief privacy officer, Michael Lamb, explained that legitimate interests allows a company “to develop innovative data uses, but the controller must perform and document a balancing test that demonstrates why its interests are not overridden by the interests of the data subject. Moreover, that balancing test can be challenged before a regulator, thereby allowing data innovation to be subject to regulatory oversight.”

A company’s legitimate interests assessment must be documented and justified, but privacy advocates remain skeptical of this approach. Access Now, for example, has argued that legitimate interests can undermine important data protection objectives by letting companies make their own determinations about when their interests outweigh people’s privacy rights. Data brokers endorsing legitimate interests likely give credence to this concern, in which case an itemized list of permissible purposes may be a preferable option. However, APRA’s approach is much less flexible for industry, which is ironic considering Congress once lambasted the GDPR for its perceived negative impact on innovation.

Does It Even Matter?

Innovation is a word that does not appear in either the text or any of the press releases announcing APRA. It is a proposal that absolutely advances the privacy rights of Americans, and it achieves this not only by minimizing data collection but by limiting how data can be used. However, we should be clear about what the bill does and does not allow, and a full accounting of that is, I admit, beyond my current understanding.

Part of the challenge presented by APRA is that it is a sprawling 138-page bill. It shares many similarities with ADPPA, but much has happened in the wider field of technology law and policy in just the two short years since that bill’s introduction: California enacted an Age Appropriate Design Code Act under the guise of protecting minors’ privacy, the European Union’s AI Act was passed into law amidst the generative AI hype cycle, and US governmental entities have taken action to limit access to TikTok on national security grounds.

If APRA is a new episode in an ongoing national privacy saga, it’s debuting in an entirely different season than ADPPA. The political landscape has shifted significantly, and while the importance of data minimization cannot be overstated, it is one principle competing among many when it comes to building a federal privacy framework. A multifaceted concept like privacy has always been difficult for lawmakers to grasp, but now privacy scholars and thinkers are starting to raise concerns that privacy, as a concept, has been spread too thin. Ryan Calo warns that overuse of the term privacy “risks its diffusion into a meaningless catchall.” Omer Tene and Jules Polonetsky caution that privacy could become “the law of everything.” Eric Goldman has more colorfully complained that privacy law is “devouring internet law” to everyone’s detriment.

I worry that APRA simply must appease too many masters. My interests in data minimization can swiftly be overtaken by debates about provisions designed to regulate artificial intelligence, protect digital civil rights, and curb the dominance of Big Tech. Already, we have ranking committee members in both the House and Senate concerned that this bill either doesn’t do enough to protect children or could be used by the government to police speech. This is how the political sausage is always made, but no one should be under any illusions that time has made APRA a more attractive option than ADPPA was in 2022.

Will APRA ultimately be enacted by Congress? There are many obstacles to getting APRA signed by President Biden, but APRA’s successful navigation of our complicated political and technology policy landscape may not be necessary for the law to have an impact. Data minimization requirements are making their way into state privacy laws, and even where those laws are inadequate or miss the mark, there is now more regulatory enthusiasm and capacity to protect privacy than ever before. If nothing else, APRA keeps that momentum going.


