Unpacking the SECURE Data Act

Justin Hendrix / Apr 26, 2026

Audio of this conversation is available via your favorite podcast service.

With artificial intelligence systems increasingly deployed by companies and governments to hoover up every possible unit of data and to make consequential decisions about people's employment, benefits, credit, education, housing, and health care, the United States still has no comprehensive federal privacy law.

This week, House Republicans put a new bill on the table. But today’s guest says it has significant structural weaknesses even as it seeks to preempt stronger state protections that are already in place. Let’s jump right in.

Writing for Tech Policy Press, Eric Null, director of the Privacy & Data Project at the Center for Democracy & Technology, says “Without significant improvements, the Act would fail to protect peoples’ privacy while giving companies a free pass to continue engaging in the same data practices consumers have grown to hate.”

What follows is a lightly edited transcript of the discussion.

Justin Hendrix:

Eric, I'm pleased to have you for the first time on this podcast and have appreciated your analysis and your work over the years. And grateful that you took the time this week to write about this new privacy legislation that's come out of the House Energy and Commerce Committee for us. Thank you very much. We're going to talk about that and talk about a few other things, but I thought I might just start by asking you to frame generally: what do you get up to in the Privacy & Data Project at CDT?

Eric Null:

Sure. So we cover basically anything related to commercial privacy. Obviously AI is front of mind at the moment, but we also have projects on health privacy and car privacy. We also think a lot about AI and discrimination. At CDT, we have a variety of different subject matter teams. So if you're thinking about government surveillance, that's actually a different team at CDT. If you're thinking about how non-law enforcement government uses data and uses AI, that's actually a different team, et cetera. We cover a lot of issues at CDT, and my team is focused specifically on commercial privacy issues.

Justin Hendrix:

Well, as Cameron Kerry put it at Brookings, “Springtime in Washington means it's time for another round of federal privacy legislation.”

Eric Null:

Yes.

Justin Hendrix:

The National Law Review said, "Here we go again." Before I get into the specifics of the SECURE Data Act, I thought I might ask you just to characterize for us the last few years of efforts at federal privacy legislation in the US. For listeners who maybe aren't totally familiar with US policy, what do they need to know about these other acronyms, ADPPA and APRA?

Eric Null:

Sure. We've had a variety of fits and starts with privacy legislation at the federal level going back decades, honestly, but I'll just cover the last several years. In 2022, we had the American Data Privacy and Protection Act, which is ADPPA. That was the first big push for comprehensive privacy legislation at the federal level after states had started passing comprehensive privacy legislation. I think by then California, Connecticut, Virginia, and a couple of others had passed laws. ADPPA was bipartisan. It got out of committee and then ultimately never made it to the House floor, so it didn't move beyond that.

And then in 2024, we had another opportunity with the American Privacy Rights Act, or APRA. That did not get as far as ADPPA did. It had a bunch of newer thinking from the privacy space on advertising and a variety of other things; it was a bit more specific and a bit better thought out, but that bill also did not make it to a vote.

It didn't even make it out of committee. So we have the third attempt now, although this one is not bipartisan like ADPPA and APRA were before. This is just the sort of Republican salvo of here's what we can agree on as the Republican Party on a privacy platform. And that's what we got on Wednesday.

Justin Hendrix:

So who is to blame for the failure to advance fundamental privacy regulation in the United States? How would you characterize why those prior bills have never made it past go?

Eric Null:

Well, I think a lot of people have a lot of different interests in privacy. You hear a lot of people say that privacy means different things to different people. Some people want to include law enforcement access to data in a privacy bill. Some people want to include cybersecurity provisions. Some people think the nuts and bolts of the privacy protections in the bill are not calibrated correctly. And given how broad an issue privacy is, and how it affects pretty much every sector and every person in the country, it's really hard to get everyone to agree on the same vehicle. Each vehicle in the past has had outspoken critics, and they've been able to stop it. It's much easier to stop something in Congress than it is to get it passed. So once a particular stakeholder or set of stakeholders decides this isn't the right thing, there are a lot of opportunities to basically sink the bill and make it so it can't progress any further.

Justin Hendrix:

Well, things haven't sat still either way. There's been a lot of progress, as you mentioned, in the states. Yet it still feels to me like this is a fundamental issue that we have not addressed in the United States, and that in the age of artificial intelligence, it's really sneaking up on us. It's really becoming a major problem. I don't know. How would you characterize the challenge as it stands right now? How far behind are we?

Eric Null:

Well, AI is inherently a data issue. Regardless of what the purpose of the AI system is, it is generally trained on a wide swath of data. I think we're seeing this more and more with the consumer-facing chatbots, ChatGPT, and these services people are seeing now that speak in natural language and can create images out of nowhere, et cetera. The way that they're able to do that is that they are fed data, they take correlations from that data, and that's how they're able to make their outputs. The general wisdom so far has been that more data is necessary to make AI better. I don't actually think that's necessarily true, but we even saw an announcement recently from Meta that they're going to start tracking what everyone writes on their keyboards and how they move their mouse around the screen, to try to learn how to do the job of a Meta employee and help train AI that presumably will eventually replace those employees.

And we've seen articles saying companies have already run out of data to train on, because they've already crawled the entire internet. They've already bought all the data they can buy from data brokers and other companies and fed it all in. I don't necessarily think it's totally true that more data means better outputs. If you're trying to get a particular quality of output, it actually might behoove companies to train on less, more accurate data. And we have problems with hallucinations. We have problems with non-factual information coming out of AI systems, and that's at least in part a training data problem. So I think companies should be on the lookout for how to not just feed more unorganized data into an AI training dataset, but to actually figure out how to create the best outputs from the system based on the training datasets they use.

Justin Hendrix:

Okay. So we've got this new acronym, SECURE, which stands for “Securing and Establishing Consumer Uniform Rights and Enforcement over Data Act.” In your piece for Tech Policy Press this week, you called it a major step backward. Why is it a step backward?

Eric Null:

This bill essentially takes the industry-friendly approach that many states have taken, the model that industry has gotten through the legislative process in a lot of states, and peels back even more from that. The reporting said this is mostly based on the Kentucky law, but even the Kentucky law includes language around impact assessments, and it includes data from smart TVs as a form of sensitive data. Neither of those is in the SECURE Data Act. There's a variety of other protections that states provide. Some define sensitive data as including financial data; some define it as including health data beyond diagnosis. The definition of sensitive data in this bill is limited. The health data is limited to diagnosis and nothing else, so anything that's not a diagnosis is somehow not sensitive data. Neural data, contents of communications, there's a variety of things that are not covered.

But there are two big things here. One is that they've adopted the data minimization standard from most of the states, which essentially preserves the status quo: as long as companies disclose what they do in a privacy policy, they're allowed to keep doing it. That is not a meaningful change in privacy. It basically says whatever you're doing, you can continue to do, and now we can pretend that we actually protect your privacy when really we don't. The other thing is that there are very wide exemptions in this bill that basically allow companies to not have to comply with the bill at all. One of those exemptions, which I talk about a little bit in the op-ed, is about providing a specific service to an individual. I think companies could very likely make an argument that a lot of the data they collect is to provide a service to an individual, and therefore they're essentially exempt from the bill.

There's another exemption that covers contracts, and some jurisdictions consider terms of service and privacy policies to be contracts. So the irony here is that if you write a privacy policy, you actually don't need to follow the privacy law at all. And then there's another exemption for internal research to improve and develop new products, services, and technologies. That, in my view, is basically the AI training data exemption. If you collect data to train an AI system, none of that data is subject to this bill.

Justin Hendrix:

Okay. So let's break it down in even more basic terms. When it comes to data minimization, a company has to disclose to a consumer in a privacy policy: here's the information we're collecting and here's how we're going to handle it. As I understand it, right now companies have to abide by whatever commitments they make. This is basically just making it the case that companies can set the rules.

Eric Null:

Yeah, I think that's pretty much right. I think that's how it's played out in the states that have this same standard. The one caveat here is that for sensitive data, which again is defined quite narrowly in this bill, you do have to get what's called opt-in consent before you can collect or process that data. However, companies are highly incentivized to get people to opt into that data collection. So if a company wants precise geolocation, which is defined as sensitive in the bill, it needs to get the consumer's opt-in consent if the collection isn't service related. And there are no protections against what we call dark patterns, which is essentially designing an interface or making design choices that push consumers to give over more information than they otherwise would. The most common one that we all see is a cookie pop-up banner where the "accept all" button is bright and blue; it's very obvious that it's there.

And the "reject all" button, if there even is one, which is not required in the US, is grayed out and dark; you can barely see it. Or the banner will just say "accept all," and there'll be another button that says "more options," which requires you to go to another page to decide which types of cookies you'll allow. So without a dark patterns protection, companies can just bombard customers over and over again to collect their sensitive data. And consumers are rightly going to get annoyed and say, "You know what? I'm sick of these popups. I'm just going to say yes." And even though they wouldn't necessarily want to allow the company to collect location data, they end up being pushed into it. So it ends up being a very weak protection for sensitive data. And for non-sensitive data, there are essentially no protections at all.

Justin Hendrix:

So another area you're concerned about is civil rights. I'll just point out it's not just you. I saw a statement by Alejandra Montoya-Boyer from the Leadership Conference's Center for Civil Rights & Technology, basically saying that the bill falls short in protecting our civil rights in the digital economy, that privacy rights are civil rights, and that we don't see the types of protections against bias and discrimination that we would like to see in a bill like this. How would you characterize the challenge when it comes to civil rights?

Eric Null:

So one of the big challenges right now is that the administration is trying to undermine the enforcement of civil rights in general. They're trying to get rid of disparate impact as a form of civil rights protections at the federal level. They're basically making it very difficult to prove discrimination based on a protected class. So that's the overarching concern that we have now. Generally speaking, what we see with technology, and particularly with what we used to call, or still call automated decision-making systems, but increasingly just call artificial intelligence generally, is that these systems can discriminate based on protected classes. And this gets back to the training data, at least in part, if you have training data that says white men are the most successful in these positions at my company, then when you ask the system to recommend people out of 10,000 resumes, chances are you're going to get mostly white men back, and that's not good.

It's arguably illegal, probably is illegal. And that's the kind of thing we're trying to get at. But there's very little transparency, and so there's very little ability to go after these companies for civil rights violations. We've seen a couple of reports, and that's basically the extent of what we have right now; the transparency around these systems is very limited. So it's hard to tell whether a company denied a person a loan because it had some legitimate reason, or because the person is Black and the automated decision system decided that Black people are less likely to pay it back. That's the kind of discrimination we're trying to address. And the protections included in the SECURE Data Act are basically not protections at all. The bill basically says whatever is already illegal under federal civil rights laws is still illegal, and that's that, which we already know.

Whether it's illegal is not the problem. The problem is whether we actually have the transparency and the knowledge to go after these companies, and also whether disparate impact continues to be a cause of action at the federal level; this administration, as I said, is going after that. I think the biggest problem with including this throwaway line on civil rights in the bill is the preemption, which is another thing I wrote about in the op-ed. It basically says no state or locality is allowed to enforce a law or rule that relates to anything in the act. They've included this one line about civil rights, which means state civil rights laws are probably going to get preempted if this passed. That's a huge problem, because state civil rights laws are where we're going to get most of our civil rights enforcement, at least during this administration, and especially if this administration is not enforcing disparate impact liability.

Justin Hendrix:

So let's tunnel into that just a bit. This is Section 15 of the bill, the relationship to state laws. You write that it could wipe out every state civil rights law, Texas' biometric privacy law, Illinois' Biometric Information Privacy Act. These have been some of the laws out there that have had the most teeth; we've seen fairly large settlements and judgments against companies under these laws. I don't know, how does that mechanism work? Why would it necessarily neuter those laws?

Eric Null:

So there's a variety of different types of preemption, and they're all scoped in different ways, from narrow to broad. Usually the narrowest is what we call conflict preemption, which is basically: if a federal law and a state law directly conflict, such that a company cannot comply with both at the same time, the federal law wins and the state law gets preempted. That is generally what we want in terms of privacy protections, because we want states to be able to go above and beyond and protect privacy in a more protective way. Then on the other side there's what's called field preemption, which is to say Congress has regulated this entire field: privacy, civil rights, AI, algorithms, whatever it may be. The scope of the field would also be subject to litigation, but that's the broadest; saying nothing can touch AI, for instance, would be field preemption.

They've included what's called "relates to" preemption, which is basically one step back from field preemption. Anything that relates to any provision in this bill would get preempted. That's why I say putting in this line about civil rights means the federal government has addressed civil rights in this bill; anything that relates to civil rights protections and data use at the state level would get preempted. The same thing with biometric data: biometric data is protected as a sensitive category in this bill, it is therefore regulated by the federal government, and anything that regulates biometrics at the state level is going to get preempted. And so on throughout the entire bill. On anything that relates to the provisions in the bill, no state can enforce or pass any laws.

Justin Hendrix:

Let's talk about another sticking point, at least in prior negotiations over privacy legislation, which has been the private right of action. That seems to be completely abandoned here.

Eric Null:

Yes. There is, unsurprisingly, no private right of action in this bill. The way that has played out in prior attempts at privacy legislation is that Republicans tend to disfavor private rights of action for a variety of reasons, including an overactive plaintiffs' bar. They're generally more opposed to statutory or liquidated damages, which we see less and less of now, because a series of Supreme Court cases has made it difficult to enforce statutory damages for data-related harms. On the other side, you generally have the Democrats, who are pro private rights of action, pro allowing individuals to vindicate their rights in court. And what happened in ADPPA and APRA back in 2022 and 2024 is that there was a middle ground: we came to an agreement on a private right of action that was focused on injunctive and equitable relief.

And that essentially means that if a company is violating the law, you can't necessarily get damages from them, but you can get injunctive relief to prevent them from engaging in the practice. Particularly for civil rights protections, I think that is very important. But even for minimization purposes and other data practices that are subject to the law, it's important to have the ability to force companies subject to this law to stop engaging in illegal activity.

And one of the ways we do that is through the private right of action, because as much as we love our FTC and we love our state AGs, they're chronically under-resourced. They obviously cannot vindicate all the harms that the hundreds of millions of people in the United States experience based on these company practices. In general, and for most of the time this country has been around, the default is that we let people enforce their rights in court. For some reason, privacy is an exception to that rule.

Justin Hendrix:

And if you are reading this bill in one of the states that does have a strong privacy law, what do you think you're thinking right now if you're in Sacramento?

Eric Null:

Yeah. So if you're in one of the many states that have stronger privacy protections, California, Virginia, arguably Kentucky, Maryland in particular, which actually has pretty decent minimization language, you definitely don't want your stronger privacy law to be preempted by a weaker federal law. The bill allows state AGs to enforce it, but only in federal court. State AGs like their state courts; they use them, and they're most familiar with them. They don't necessarily want to be forced into federal court, which is what preempting their laws would do: it would force them into enforcing the SECURE Data Act in federal court. But in general, there are many states that have stronger protections than this bill has, and they certainly should be making the case that their better state laws should not be preempted by a weaker federal standard.

Justin Hendrix:

So what do you think are the prospects for this bill to move forward in this Congress? We've got a kind of fractured scenario. Folks are distracted by a war. There's a lot happening on Capitol Hill. How do you read the politics? Is this going to make it out of committee?

Eric Null:

I'm not a political expert, I will caveat that. I think this year is particularly difficult to move something, because of the election, because of the war, because of a variety of different things. ADPPA and APRA came out around the same time of year as this, but they were already bipartisan at that point, so they could move through the process faster. And I think either of those bills could have gone through the full process during those years. I think this one is further behind. As Republicans have said many times, this is a starting point. They do expect it to change, which I think is good, but I think it's just so fundamentally weak that we really should be starting from the ADPPA or APRA model rather than something like this.

Justin Hendrix:

Okay. So we've talked about a lot of things, right? We've talked about the civil rights challenges. We've talked about the challenges to the state picture. We've talked about AI's privacy implications. Do you think in the near term, looking even beyond this bill, there are scenarios where we get meaningful federal privacy legislation in the United States?

Eric Null:

Predicting this has basically become an impossible task, but hope springs eternal. It is something I would love to see happen in my lifetime, and I have a long life ahead of me, so hopefully it does. I do have some hope for the states. There have been some pretty close calls on passing good privacy legislation. Obviously, Maryland has a good privacy law. Maine got very close. Massachusetts is working on a good law. I think there's a lot of hope to be had there. At the federal level, it's a little harder to have hope, but I still hope that we can come together. We have twice in the past, and I think we can again in the future, whether it's using this bill as a base or using ADPPA and APRA, although I strongly prefer the latter.

I don't think I can predict whether or when it will happen, but I remain hopeful that we can get something out of Congress at some point, because it would be great to have not just some percentage of Americans with privacy protections, but everybody. Ideally, the sort of world I'm envisioning in my head is that people just go online, they buy things, they interact with people, they upload stuff, they hang out and do cool stuff, and they don't have to worry about their privacy, because they know they have a strong federal privacy law protecting them. They know that if a company violates it, there are going to be strong repercussions. But right now we don't have that. We have cookie banners and companies that do not care at all about your privacy and will extract anything they can out of you to further train AI, to better target behavioral ads at you, et cetera, et cetera.

Justin Hendrix:

And ultimately to sell it back to the government and law enforcement.

Eric Null:

Also that. They'll sell it to a data broker. The data brokers will sell it to any buyers that come along.

Justin Hendrix:

So let me ask you about one last thing you were working on, which is in California, something called the Base Act. What's this all about?

Eric Null:

So there's obviously been a big push recently to come up with new competition protections and new competition remedies. A lot of people feel that companies are getting too big and too powerful, that they're monopolies, et cetera. I'm not a competition expert. As I said at the beginning of this, CDT has lots of teams, and competition is one of them, but I did work with my competition counterpart to put together a letter, because the Base Act out of California is a bill that would've forced...

There are a lot of provisions on things like self-preferencing that we see in a lot of different bills, but in particular, there was a provision that required forced interoperability between essentially large platforms and essentially any third party that sought to interoperate with those services. And often what you'll see in these bills is some form of privacy protection, to allow a company to say, oh, there's actually a privacy interest here that matters.

And so we're actually not going to share data. As long as that's not a false pretense to just hoard data, that's actually a good thing in general, but the Base Act doesn't have any privacy protections built into it. So essentially what we said was that there's a variety of data-related harms that could come out of forced interoperability. One of them is forced decryption of encrypted messages. iMessage is encrypted end-to-end by default. WhatsApp, at least consumer to consumer, is end-to-end encrypted by default. Also, many of the major companies have built in protections for when law enforcement comes seeking data: they will challenge overbroad subpoenas or warrants to try to narrow them, to protect the data they collect from getting into the hands of law enforcement. None of those protections necessarily exist at the third parties that would be interoperating with these companies.

We already saw with Signal that prior notifications held in your phone's memory could be used to check whether even deleted messages were still there. That was just a small thing, and it was just by happenstance that Apple happened to keep that data for some reason, although I believe they've since fixed it. But here we're talking about forced decryption to share with any third party that wants to interoperate with iMessage or something like that. If you want to get a third-party watch and you want your notifications sent to your watch, Apple would have to decrypt the message to send it to the watch, or WhatsApp would have to do the same thing. So there are serious privacy issues with a competition bill like that. I think they can be balanced, but in the Base Act, we did not feel they were properly balanced.

Justin Hendrix:

Is that a problem that extends beyond our borders, as we see this measure and potentially other interoperability measures elsewhere?

Eric Null:

Yeah. I'm not an expert on these areas of law, but I do believe that Japan's law has a pretty decent privacy protection built into its competition law. And I believe the EU's could be better, although it does also incorporate some privacy protections. So it's all over the place. But our view is, if you want expanded competition, we're all for it. We just don't want serious unintended privacy consequences along the way.

Justin Hendrix:

Eric Null, I appreciate it very much. I would recommend folks follow your work over at CDT, and I hope that we can have you both on the pages of Tech Policy Press and back on the podcast again in the future.

Eric Null:

Thank you very much. Appreciate it.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President of Business Development & In...
