Transcript: US Senate Hearing on 'Safeguarding Americans' Online Data'
Justin Hendrix / Jul 31, 2025
The Dirksen Senate Office Building in Washington, DC. (Shutterstock)
On Wednesday, July 30, 2025, the US Senate Judiciary Subcommittee on Privacy, Technology, and the Law hosted a hearing titled "Protecting the Virtual You: Safeguarding Americans' Online Data." Witnesses included:
- Kate Goodloe, Managing Director, Business Software Alliance (written testimony)
- Paul Martino, General Counsel, Main Street Privacy Coalition (written testimony)
- Joel Thayer, President, Digital Progress Institute (written testimony)
- Alan Butler, Executive Director and President, Electronic Privacy Information Center (written testimony)
- Samuel Levine, Senior Fellow, UC Berkeley Center for Consumer Law & Economic Justice (written testimony)
What follows is a lightly edited transcript.
Sen. Marsha Blackburn (R-TN):
The Subcommittee on Privacy, Technology, and the Law will come to order. And Senator Klobuchar is on her way. She'll be here in a couple of minutes, but we will go ahead and begin since we do have five witnesses and we thank each of you for giving your time and being here today. Today, we are going to put our attention on what I think is one of the most consequential issues up for discussion when we talk about the virtual space and that is how does each and every individual American preserve their privacy and their personal data in the virtual space? The title of the hearing: Protecting the Virtual You: Safeguarding Americans' Online Data. This speaks to what is becoming a growing connection between you and the physical space and what you are doing each and every day in your transactional life in the virtual space or the digital version of yourself. And this comes through how companies collect, track and monetize your data, and every single bit of that is done without your consent or your knowledge.
In today's economy, data is currency. Everything from your shopping habits to your health information, your children's online activity to your political views can be identified, sold, and resold often with little transparency or recourse. Meanwhile, consumers are left to decipher lengthy privacy policies and click "agree" at the bottom of the page even before they can begin to access any online service. The absence of a comprehensive national data privacy framework has left millions of Americans vulnerable. While numerous states have enacted privacy laws, the result has been a patchwork that fails to provide the clarity, consistency, and confidence that consumers and responsible businesses need and deserve. For years now, I have been clear. We need a national privacy standard that is comprehensive and enforceable, one that empowers consumers, promotes innovation, and ensures accountability. It should prioritize transparency, minimize data collection, and provide meaningful consent, not just a box to check.
We have a panel of witnesses here this afternoon who all agree that there is an urgent need for a comprehensive bill. Now, there's probably going to be some disagreement about how we get to that national standard, but we can agree on one thing: it is past time for Congress to take up this issue, to take action to pass a bill, and to see that bill signed into law. We should also acknowledge how closely this issue is tied to the safety of our children online. Senator Blumenthal and I have worked diligently on the Kids Online Safety Act, which would require platforms to design their products with children's well-being in mind, not just their bottom line. We've seen time and again how data-driven algorithms target kids with addictive content and expose them to harmful material. Business models that profit from children's vulnerabilities must be reined in. It is absolutely disgusting that our children are the product when they are online.
And through the Open App Markets Act that I introduced with Senator Klobuchar, I have worked to increase competition and consumer choice in the digital marketplace. Whether it's protecting your personal data, your right to download the apps you want, or your ability to access services, the common thread is this: users, not tech giants, should be in control of their own online lives. In today's hearing, we'll explore core principles that should go into a national data privacy framework that reflects American values. We'll ask: What categories of personal data deserve heightened protection? How can we give consumers real control over how their data is used? And how do we ensure that AI systems, which are only growing more powerful, are accessing and using consumers' data and information in a responsible way?
As artificial intelligence becomes increasingly embedded in everyday life from how we shop to how we work, communicate, and make decisions, Americans deserve to know when, where, and how their data is being used to shape their online experiences. We have an opportunity and a responsibility to get this right and I am looking forward to your testimony today and to the questions that we will have as we move forward. Senator Klobuchar, you're recognized.
Sen. Amy Klobuchar (D-MN):
Well, thank you very much, Chair Blackburn, and thank you to all of our witnesses. And I'm really grateful for your leadership on these issues, Madam Chair, and your willingness to work with me and Senator Blumenthal and many others. We all know new technologies have made it easier for people to monitor their health ... a-ha, there I've got one ... collaborate with colleagues, communicate with loved ones, and more, but federal law doesn't do enough, as we all know, to address the privacy concerns that come with these innovations. Technology companies collect an enormous amount of personal information about our daily lives. They know what we buy, who our friends are, where we live, where we work and travel, even how much we would be willing to pay for something. Yet for too long, the big tech companies, many of which dominate the markets they operate in, have been telling American consumers, "Just trust us," even though their business models are designed to collect personal information and to use it for profit.
The bottom line is that we are the product. We are. And that's how many tech companies make their money, and a lot of it. In 2024, Google and Meta earned a combined $420 billion in advertising revenues alone, and they made a lot more money because Americans lack privacy protections; Americans' data earned Meta $68 per user in a single quarter last year. Think about that. All these people who don't realize that they're being tracked. But a European Facebook user with comprehensive privacy protections generated only $23, and that money could be used for a lot of other things that people need right now. And it seems like every day we hear a new story about companies playing fast and loose with data and taking advantage of customers.
Earlier this year, a whistleblower from Facebook, now Meta, testified to another subcommittee about how the company would track users so closely that it could identify when teenage girls felt emotionally vulnerable and then target them with ads exploiting these emotions. For example, when a teenage girl would delete a selfie, Facebook might serve her an ad for diet products. Criminals also view huge troves of data as attractive targets for hacking. We've seen major data breaches ranging from the 2017 Equifax data breach that exposed sensitive financial information from more than 140 million individuals to the hack of Change Healthcare affecting 190 million people and causing more than 100 electronic systems vital to the US healthcare system to be shut down.
On my way here, I was on the phone with the mayor of St. Paul, Minnesota because they, like so many other jurisdictions, are responding to a targeted cyber attack on their IT infrastructure, which has shut down some of the city's digital services and may have compromised city employee data. Once in the hands of criminals, data can be used for everything from identity theft to more serious crimes. And we all learned too tragically, with the horrific murders in my state of my good friend Melissa Hortman, a former Speaker of the House, and her husband, Mark, how accessible personal data is, including people's addresses; the murderer went only to the houses of people whose addresses he had. Businesses are also using personal data collected across the internet in novel ways, such as to set individualized prices designed to increase costs for consumers.
Should a person, and this is a question we have to ask as senators, really have to submit to this kind of intrusive data collection just to send a message to a friend online, or to book a flight, or to order some diapers? I don't think so. That's why more than 20 states have stepped in. I suspect today we'll hear from some of our witnesses about the patchwork of state laws. I agree. It's a problem. But I believe we should have passed privacy legislation many, many years ago. I advocated for it back then. We tried. And in fact, in 2019, I introduced a comprehensive privacy bill, and I was a co-sponsor of the bill from Senator Cantwell and Cathy McMorris Rodgers, a former Republican House member. The bill would've required companies to collect only the information necessary to provide the goods and services that consumers sought, ensured consumers consented before their personal data was shared with third parties, and put consumers in control of their data by allowing them to access, correct, and even delete personal data.
Many of the businesses that today complain about the burden of complying with the patchwork of state laws were lobbying against a federal privacy law back then; I have the advantage of having been there, even before Maria Cantwell's bill was introduced. And now they're back complaining about the patchwork of laws. I would like to change that, but I do think it's important to know that's why we're in the position that we are, and to understand why some of these states are looking at this going, "Wait a minute." The need for federal privacy reform is even more urgent as AI continues to expand its role in our lives. Data is both the gasoline and the engine for AI models. That means that demand for our data is skyrocketing, so it is critical that we set guardrails to ensure the data that powers AI is responsibly sourced, used for legitimate means, and protected when you want it protected.
Luckily, there is a bipartisan agreement that Congress needs to act. The Commerce Committee on which Chair Blackburn and I also sit has seen strong bipartisan bicameral proposals for federal privacy reform. Not everyone agrees with all of them, but there has been some start out of that committee and I look forward to hearing from our witnesses about why we need these guardrails now. Thank you, Senator Blackburn.
Sen. Marsha Blackburn (R-TN):
I thank you and our witnesses. Ms. Kate Goodloe is managing director at the Business Software Alliance, where she develops policies on privacy, AI, and law enforcement access. She also taught AI law at GW Law School prior to her time at BSA. Ms. Goodloe was a senior associate at Covington & Burling, focusing on privacy and cybersecurity. She earned her JD from the New York University School of Law. We welcome you. Mr. Joel Thayer is the president of the Digital Progress Institute and founder of Thayer, PLLC. He has represented clients before the FCC, FTC, and federal courts on issues relating to telecom law, data privacy, cybersecurity, and competition policy. Before that, he held positions at The App Association, the FCC, the FTC, and the US House of Representatives. Since earning his JD from American University Washington College of Law, he has been recognized as a Super Lawyers Rising Star for his work in communications law and digital policy.
Mr. Paul Martino is a partner at Hunton Andrews Kurth LLP. He has nearly 25 years of experience in public policy and government relations specializing in privacy, data security, AI, E-commerce, and tech. Mr. Martino is the founder and general counsel of the Main Street Privacy Coalition. Before joining Hunton, he served as VP and senior policy counsel for the National Retail Federation and co-chaired the Privacy and Data Security Task Force at Alston and Bird. After earning his JD from the University of California Berkeley School of Law, he served as majority counsel on the Senate Commerce Committee for then-Chairman John McCain. Alan Butler is the executive director and president of the Electronic Privacy Information Center. Before his role as executive director, he managed EPIC's Litigation and Amicus program where he filed briefs before the US Supreme Court and other appellate courts in privacy and civil liberties cases. After earning his JD from UCLA School of Law, he was admitted to the DC Bar and the State Bar of California.
Samuel Levine is a senior fellow at the Berkeley Center for Consumer Law and Economic Justice. He previously served as director of the Federal Trade Commission's Bureau of Consumer Protection. Prior to that role, Mr. Levine served as an attorney advisor to Commissioner Chopra, as an attorney in the FTC's Midwest Regional Office, and as an assistant attorney general in Illinois. After earning his JD from Harvard Law, he clerked on the US District Court for the Northern District of Illinois. We welcome each of you for being here. Now I'm going to ask you to rise and raise your right hand. Do you swear that the testimony that you're about to give before this committee is the truth, the whole truth, and nothing but the truth, so help you God? And we will note that everyone has answered in the affirmative. Okay, Ms. Goodloe, you are recognized for five minutes, and we'll go right down the line.
Kate Goodloe:
Good afternoon, Chair Blackburn, Ranking Member Klobuchar, and members of the subcommittee. My name is Kate Goodloe. I'm managing director at the Business Software Alliance, or BSA. BSA Members create the business-to-business technologies used by companies across industries. Privacy and security are core to our members' operations. I commend the subcommittee for convening today's hearing and I thank you for the opportunity to testify. The United States needs a strong, clear, comprehensive consumer privacy law. BSA has been a longtime supporter of adopting a federal privacy law. Americans share their personal information online every day. Whether we shop online, use apps to track our workouts, take ride-shares, or host video calls with friends and family, we provide personal information to a broad range of companies. Consumers deserve to know their data is used responsibly.
In our view, a federal privacy law should achieve three goals. First, it should require companies to handle consumers' personal data responsibly and assign obligations to companies based on their role in handling that data. Second, it should give consumers new rights. And third, it should create strong, consistent enforcement. I want to focus on that first goal to create the right set of obligations. A privacy law must recognize different types of companies handle consumers' data. Those companies must all adopt strong but different safeguards to effectively protect consumers. Most importantly, not all companies are consumer-facing. BSA represents the business-to-business technology providers that work for companies across the economy.
An online store that sells clothing, for example, will rely on a series of business-to-business technology providers. It may use one to manage customer service inquiries, another to track deliveries, and a third to protect its data against cybersecurity threats. Each company must protect the personal data it handles, but companies need to take different actions to effectively protect consumers because they play different roles in handling their data. Laws should not create a one-size-fits-all obligation treating an online store and its cybersecurity vendor alike. Doing so actually creates new privacy and security risks for consumers.
Now this is something that states get right. 20 states, both red and blue, have adopted comprehensive consumer privacy laws, and those laws are remarkably consistent. All 20 reflect a fundamental distinction between two types of companies that handle consumers' data and assign strong but different obligations to each. The first are controllers. These companies decide how and why to collect a consumer's personal data, and state laws give them obligations about those decisions, including telling consumers how and why they process data, responding to consumer rights requests, asking for consent to process sensitive personal data, and minimizing the collection and use of data in the first place.
The second type are processors. These companies handle data on behalf of a controller, and state laws give them a common set of obligations too. Those include processing data pursuant to the controller's instructions, entering into a contract with the controller, handling the data confidentially, and giving the controller the information it needs to conduct privacy assessments. These roles reflect the modern economy. They're not unique to state laws, and they're not new. The distinction between controllers and processors dates back more than 40 years, and it underpins privacy laws worldwide. It must be part of any federal privacy law. In addition to putting obligations on companies, a federal privacy law should create new rights for consumers and strong, consistent enforcement. Here, too, you can look to the states. There is widespread agreement on consumer rights. All 20 states give consumers rights to access, delete, and port their personal data. Nineteen also give a right to correct inaccurate data. States also create similar enforcement mechanisms, with all 20 giving a leading role to the attorney general to enforce privacy violations.
I look forward to discussing consistent aspects of these state laws, but I want to say that consistency may not last. This year, we've seen a striking interest in amending existing laws to revise, expand, and change their protections and new obligations coming through rule-makings. A federal law is needed to bring consistency to existing protections and to create broad, long-lasting protections for consumers. A federal law should not weaken protections already provided by the states, but extend those protections to consumers nationwide. There is significant common ground between industry and civil society stakeholders on comprehensive federal privacy protections. We look forward to working with Congress on these issues. Thank you and I look forward to your questions.
Sen. Marsha Blackburn (R-TN):
And well done. Right at five minutes. You get a gold star on that. Mr. Thayer?
Joel Thayer:
I'll try to emulate it. Thank you, Chairwoman Blackburn, Ranking Member Klobuchar, and esteemed members of this committee for inviting me to testify and holding this important hearing. My name is Joel Thayer and I'm the president of the Digital Progress Institute, a think tank based in Washington, DC focused on promoting bipartisan policies in the tech and telecom space. Ensuring privacy for all is a founding principle of the institute, and as such, I very much appreciate the committee's commitment to building out a privacy framework that further assures that the integrity and ownership of our digital selves remains in our domain, not with a company with a domain name.
Although our privacy from our government is well established, that is unfortunately not the case with respect to private companies. With the allure of free services, we provide details about our most intimate selves to trillion-dollar tech companies, who in turn make enormous profits off the data they collect. They know everything about us: what we like to eat, when we sleep, where we live, where we are, our beliefs, and even our fears. Curiously, though, they claim our age confounds them, but let's set that aside for now. A recent Pew study shows that 73% of Americans feel they have limited to no control over how companies use their personal information, and the reality is they don't. We sign privacy policies that are filled with so much legal jargon that they may as well be unintelligible to the average person. And presto, our data is now their data.
The problem is not just they sell our data to third-party advertisers, but also to those who use our data to create fake images, curate biased news feeds, conduct elaborate scams, and even engage in espionage campaigns. In short, we are not in control and Americans are right to be concerned. And with the advent of AI, this trend is only going to increase. It's no wonder why 85% of people want more privacy protections. We need government intervention here. The good news is that protecting privacy is a bipartisan issue. Indeed, 20 states across the political spectrum have passed privacy laws. And as evidenced by this hearing, Congress appears poised to address this issue again.
We welcome this much-needed development.
With that in mind, here are a few high-level suggestions as the committee evaluates paths forward.
First, it's important to define your goals and keep the framework targeted at accomplishing its goals. One of the primary issues with previous attempts at passing meaningful privacy laws has been that bills attempt to do too much all at once. We have seen the most success in legislation that has clearly articulated goals with targeted solutions. It's why the institute has supported targeted bipartisan measures, such as the Protecting Americans from Foreign Adversary Controlled Applications Act, that's a mouthful, the TAKE IT DOWN Act, the Kids Online Safety Act, the App Store Accountability Act, and OAMA, just to name a few. As we have seen in the EU's GDPR, overly sweeping privacy laws have the unintended consequence of entrenching incumbents. The GDPR should be a cautionary tale for the US because it clearly shows that privacy regulations without market guardrails can seriously exacerbate today's competition issues we have with big tech.
Second, enforcement matters. In our experience, agency actions or attorney general enforcement are the most effective, whereas a private right of action alone may act more as a carrot than a stick, given these companies' seemingly endless teams of lawyers and budgets. For instance, the Texas Attorney General recently secured a $1.4 billion settlement against Google for violating its privacy law, whereas when consumers sued Apple under California's privacy law, in part for sharing recorded conversations, including personal health information discussed with physicians, with medical ad companies, they were only entitled to a meager $95 million. Worse, consumers won't see about a third of that because it's reserved for their lawyers.
Third, the broader the federal statute, the more important preemption will become. That's because targeted legislation is less likely to run into differing state privacy regimes. Any preemption framework should be clear on what it is preempting and should reserve rights for state attorney general enforcement. Key areas ripe for preemption, though, are basic definitions, like what "personal information" means; the creation of data rights, which seems to be unanimous amongst all the state privacy laws; and, of course, being specific about what data management practices we seek to prohibit.
In sum, the reality is that if these big-tech companies cared about user privacy, they would protect it. Frankly, it's in their interest not to. Congress needs to act. Once again, I would like to thank the subcommittee for allowing me to testify and I welcome any questions you may have. Two seconds on this one.
Sen. Marsha Blackburn (R-TN):
There you go. Well done. Mr. Martino, the pressure is on.
Paul Martino:
Thank you, Chair Blackburn and Ranking Member Klobuchar for the invitation to be here today. I am Paul Martino, a partner at Hunton Andrews Kurth here in Washington, and I serve as the general counsel for the Main Street Privacy Coalition.
Our coalition members represent a broad array of companies that line America's main streets. They interact with consumers each day. They're found in every town, city, and state, providing jobs, supporting our economy, and serving Americans as a vital part of their communities. Collectively, main street businesses directly employ approximately 34 million Americans and contribute $4.5 trillion to our nation's GDP. Since 2019, the coalition has supported federal privacy legislation that would establish a single nationwide law to protect the privacy of all Americans.
From where we sit here on Capitol Hill today, we can travel to two states in 20 minutes by car or metro, just like many Americans who live in tri-state areas or near state lines. Should Americans' privacy rights change as they drive from D.C. into Maryland or Virginia? They do right now, but many don't know that. Americans expect their privacy to be protected the same everywhere.
Our coalition members share a strong conviction that a preemptive federal privacy law will benefit consumers and main street businesses alike. It would give consumers confidence that their data will be uniformly protected across America regardless of where they live or choose to do business, and it would provide the certainty main street businesses need to lawfully and responsibly use data to better serve their customers online or across state lines. Establishing a uniform national law that extends consumer privacy rights and consistent privacy rules to all consumers and businesses in America is a core principle for Main Street. I will highlight two more.
First, a federal privacy law should protect consumers comprehensively with equivalent standards for all businesses. A privacy law should empower consumers to control their personal data used by businesses regardless of business type. Likewise, businesses must be permitted to lawfully use data consumers share with them to better serve customer needs. To meet these goals, we recommend a federal privacy law that creates equivalent privacy obligations for all businesses handling consumer data. This would be a change from past federal privacy bills that narrowed obligations for service providers in big tech, telecom, cable, and financial industries, relieving them from the same obligations that apply to main street businesses.
For privacy laws to succeed for consumers, it is critical for all entities handling consumer data to secure that data and protect consumers' privacy rights. This is true regardless of the terms used in privacy laws that blur the reality of who actually controls the data. The label "controller," which is applied to every main street business that directly serves a customer, can create a false impression about the power of main street businesses as they interact with big-tech service providers. Main street companies control their relationships with customers, a responsibility they value, but very few can control how nationwide service providers operate and do business. Powerful big-tech and ISP service providers require main street businesses to sign take-it-or-leave-it contracts that dictate the terms of their service. The myth that big-tech processors merely follow the instructions of the typical main street business is not credible. Privacy laws should not permit any industry sector to shift its responsibilities onto another.
Ensuring equivalent data privacy obligations across industry sectors is also inherently pro-consumer. Consumers have the right to expect privacy rules they can understand, predict, and support, rules that meet their expectations. Congress can pass a law to ensure that all businesses protect consumers' privacy and that processors cannot hide behind labels that make it appear they have no control at all.
Finally, federal privacy laws should hold accountable all entities handling personal data with the same enforcement mechanisms. This creates an even playing field with proper incentives across industry. The law should encourage compliance that protects consumers more effectively than gotcha lawsuits that threaten main street businesses striving to be in compliance. This is why state privacy laws thoughtfully couple government notice with the opportunity to quickly correct or cure mistakes.
Thank you and I welcome your questions.
Sen. Marsha Blackburn (R-TN):
And you came in with a few seconds on the clock. You're in the lead. All right, Mr. Butler, we're going to see what you can do here.
Alan Butler:
Thank you, Chair Blackburn and Ranking Member Klobuchar and members of the subcommittee for the opportunity to testify today-
Sen. Marsha Blackburn (R-TN):
Microphone.
Alan Butler:
Sorry. For the opportunity to testify today about the need to better safeguard Americans' online data.
My name is Alan Butler and I'm the Executive Director at the Electronic Privacy Information Center. EPIC is an independent nonprofit research organization established in 1994 to secure the right to privacy in the digital age for all people.
Twenty-five years ago, the Federal Trade Commission issued a report to Congress based on its research of privacy risks in the online marketplace. The takeaway was clear: self-regulation does not work, and we need legislation to ensure adequate protection for Americans online. In the decades since that report, we have seen our digital world expand and develop in amazing ways, but without strong privacy protections, we have seen an alarming expansion of surveillance and data abuses online that threaten our rights and subvert our most fundamental values of autonomy and freedom.
The status quo is untenable. If the law allows a company to scrape images of all of us to build a universal facial recognition database, while another company tracks every site we visit to build invasive profiles, and yet another company buys and sells our logs of daily movements, do we have privacy protection at all? I believe any reasonable person would say "no" and would demand that our lawmakers step in to fix this broken system.
In my testimony today, I will describe the current state of state privacy law and identify the areas where federal leadership would be most impactful.
Privacy is a fundamental right, and Americans deserve a law that actually protects our data. In the absence of action by Congress, states have stepped in to advance digital rights in the information age. This has been an important catalyst for change, but there's more work ahead to establish robust privacy standards. There is significant bipartisan agreement across party and state lines about the need for privacy protection and the core principles that should shape the law. So our attention at the federal level should be on establishing clear rules of the road to make our digital world safer and more secure. What we cannot do is pass a weak federal standard that prevents states from responding to new challenges and emerging threats in the future. A federal privacy law should set a consistent and robust standard for protection while preserving flexibility for states in the future.
Over the past seven years, 19 states have passed comprehensive data privacy laws, and many states have also passed bills aimed at preventing specific privacy harms. Most of these state laws follow a common framework and have many of the key components of any modern privacy law, but unfortunately, these laws do very little to actually limit abusive data practices and to protect privacy. In a recent report, EPIC analyzed these laws in detail and graded each of them. Eight received Fs and none received an A.
So, what went wrong? The tech industry has invested heavily in state lobbying to water down the substantive protections, narrow their scope, and add exceptions that swallow the rules, but over the last two years, we have seen stronger state proposals building off the bipartisan framework that Congress created in 2019 and 2021. The Maryland Online Data Privacy Act, for example, passed last year. It builds on existing state laws and incorporates strong data minimization protections, and a ban on the sale of sensitive data. Inspired by Maryland's success, 10 states have introduced bills with strong data minimization rules this year. Several states that originally passed weak privacy laws have revisited and amended their laws to strengthen their protection.
Any federal privacy proposal should have a strong data minimization rule, include heightened protections for sensitive data, and establish robust enforcement mechanisms.
Data minimization offers a practical solution to our broken internet ecosystem. Instead of allowing data collectors to dictate privacy terms, data minimization rules set clear standards to limit the processing of our data. Companies can collect the data they need to provide the services we want. This standard better aligns businesses' conduct with what consumers expect and stops abusive data practices, like third-party tracking and profiling.
Enhanced protections can also ensure that our most sensitive data remains confidential and secure. So much information about us that has traditionally remained private is now captured in digital form: our health records, our movements, our biometrics and genetic markers, even the data about our children. These records are frequently targeted by hackers and scammers and should be locked down and secure.
Strong privacy standards should also be backed up by robust enforcement, including the three-tiered approach that we saw in the federal bill. And while state and federal enforcement is essential, the scope of data collection online is simply too vast for any one entity to regulate, and that is why private rights of action with enforceable court orders are so important.
EPIC has been calling on Congress to pass a strong privacy law to protect all Americans for the past 25 years. We are grateful that the subcommittee is turning its attention to this important issue and we urge federal lawmakers to learn from states' experience.
I thank you for the opportunity to testify today and I look forward to your questions.
Sen. Marsha Blackburn (R-TN):
And Mr. Levine, you're recognized.
Sam Levine:
Thank you, Senator. My name is Sam Levine, and I'm a Senior Fellow of Berkeley's Center for Consumer Law & Economic Justice. Until January, I led the FTC's Bureau of Consumer Protection.
Today, protecting Americans' personal information is about much more than privacy. It's about whether we can afford essential goods, whether we can be profiled based on our political or religious beliefs, and whether the next generation will grow up addicted to screens. I'll be focusing on three real-world threats that unchecked privacy abuses are fueling: threats to economic fairness, democratic freedoms, and the safety of kids and teens.
Let's start with economic fairness. On a recent earnings call, Delta Air Lines executives boasted they could soon raise prices on plane tickets, not by adding value, but through a new formula: stop matching competitors' prices, unbundle basic services, and charge each passenger the most they're willing to pay. Investors cheered the news, calling this the holy grail, but we should call it what it is: personalized price gouging. And it's only possible because weak privacy protections are allowing companies to track our behavior and predict how much we can be pushed to pay.
This practice, also known as surveillance pricing, is spreading. More and more businesses are looking to price everyday goods, from groceries to hardware, the way airlines are pricing tickets. And let's be clear, their goal is not to lower prices. It's to charge each person as much as possible, and the people hit hardest will be those with the fewest options: a parent buying baby formula, a senior filling a prescription, or a family booking last-minute travel to a funeral. Unchecked data collection is moving us from a world of one product, one price to one person, one price, and if we don't act, the shift will be costly.
Unchecked data collection is also putting our democratic freedoms at risk. Last year, the federal government alleged that an entity was tracking Americans' movements and profiling them into categories, like Wisconsin Christian churchgoers, likely Republican voters, and restaurant visitor during COVID quarantine. This was not a foreign adversary. This was a US data broker. The FTC sued to halt these practices. That lawsuit should be a wake-up call. No American should be profiled based on their politics, their religion, or their stance on COVID lockdowns. Yet without strong data protections, that's exactly what brokers are doing. Political and religious freedom cannot thrive in a society where our movements, beliefs, and behaviors are tracked, recorded, and then sold to the highest bidder. We need to act.
We also need to act to protect our next generation. Over the past two decades, big tech has been running a massive experiment on our children: what excites them, what enrages them, and what holds their attention. The result is a youth mental health crisis. Weak data privacy is powering these harms. Social media companies collect personal data to power their ad-driven business models. More screen time means more revenue and more insights into how to keep kids hooked. It's a dangerous feedback loop that profits from addiction and it's getting worse.
Today, companies are building AI chatbots engineered to earn kids' trust and keep them engaged, and that means serving up content that's provocative, obscene, and sometimes dangerous. One bot reportedly told a teen that self-harm feels good. Another offered lessons on how kids can hide drugs and alcohol and how to set the mood for sex with an adult. You might expect these incidents to prompt a pause, but the opposite is happening. The same tech giants that have been putting kids at risk for years are now racing to roll out AI chatbots, and respectfully, they are doing so because Congress is not telling them they need to stop. That must change.
Across each of these threats, the common thread is weak data protection, but we can fight back. Strong privacy laws can stop companies from using personal data to set individualized prices, ban the profiling of Americans based on sensitive information, and end the surveillance that's fueling an endless cycle of harm to kids and teens.
Thank you for holding this important hearing today, and I look forward to taking your questions.
Sen. Marsha Blackburn (R-TN):
And you win the gold medal.
Sam Levine:
Thank you.
Sen. Marsha Blackburn (R-TN):
Yes, I think it was 23 seconds left. We're going to move to questions, and Senator Klobuchar, I will let you begin.
Sen. Amy Klobuchar (D-MN):
Okay, very good. Thank you very much.
So, as I discussed earlier, there've been a number of bipartisan proposals for a federal data privacy law that have been introduced over the years, including the American Privacy Rights Act and the American Data Privacy and Protection Act. I guess, Mr. Butler, I will start with you. Why is it so essential that we put reforms like these in place for consumers across the country?
Alan Butler:
Well, thank you for the question, Senator Klobuchar. I mean, we've seen what happens without federal leadership on privacy. Surveillance tools have become embedded in every website and app that we visit, and without a federal standard, companies really don't have the incentive to innovate on privacy protection, and a few big-tech firms dominate the marketplace. So we're fueling harms to individuals, we're fueling harms to the market, and we're just allowing ourselves to be inundated by these surveillance and abusive data collection practices.
Sen. Amy Klobuchar (D-MN):
Thank you. And Ms. Goodloe, in your testimony, you highlight that there's broad consensus on many privacy principles across the 20 states that have them, both Democratic and Republican-led. I think Mr. Butler was mentioning how some of the early laws were weaker, there have been some improvements. What are the significant areas of bipartisan consensus that should be at the core of federal privacy legislation?
Kate Goodloe:
Thank you for the question. We see a lot of consensus on the right set of rights to give to consumers, both affirmative rights, like the ability to access, correct, and delete their information, and rights to opt out of certain activities, including the sale of their data, profiling, and targeted advertising. I think there is consensus among many, if not most, of these state privacy laws on that set of important issues.
There's also a core set of obligations on companies. For controllers, it's things like asking for consent to process sensitive data. We have 17 states that require companies that are processing sensitive data to conduct privacy assessments, looking at the sensitive issues arising from that processing. And when it comes to processors, there is broad consensus that they have a separate set of rights to handle data on behalf of a controller pursuant to their instructions and to do so confidentially.
Sen. Amy Klobuchar (D-MN):
Okay, thank you. Mr. Levine, while at the FTC, you prosecuted unfair and deceptive acts and practices related to data privacy as well as other privacy laws, like those intended to protect young children. Despite your efforts to use every legal tool at your disposal to protect privacy, what gaps exist that are the most critical for Congress to fill through a comprehensive data privacy bill?
Sam Levine:
Well, thank you for the question, Senator. And as you alluded to in your remarks, we currently live under a privacy regime where companies have taken the position that they can basically do whatever they want, so long as they disclose it in their privacy policy.
Over the last four years at the FTC, we took a number of steps to try to push back against that. We told GoodRx they couldn't share sensitive medication information with Facebook, even if consumers clicked "yes." We told BetterHelp it couldn't share with advertisers what mental health treatments people were seeking. We told Amazon Ring that its employees couldn't spy on people who were using their security cameras. But I can tell you, Senator, that every case we brought, when I would meet with counsel for those companies, they would tell us the same thing, "Well, we put it in our privacy policy, so it's legal."
I think our enforcement, the FTC's enforcement and state enforcement and privacy enforcement would be far more effective with bright-line rules on what companies can collect, how they can use it, and with whom it can be shared. Without that, you're going to continue to see a whack-a-mole approach that doesn't do enough to protect Americans' privacy.
Sen. Amy Klobuchar (D-MN):
Thank you. Very good. Mr. Thayer, I've long advocated for common-sense rules to require the platforms to allow competing businesses the same access to the platform that they give themselves. Senator Blackburn has advocated for similar reforms in app store markets, but as you mentioned in your testimony, dominant platforms use privacy concerns as a pretext to avoid opening up their platforms to fair competition. How can interoperability requirements be implemented without putting user privacy at risk?
Joel Thayer:
Thank you for the question, Senator, and also, thank you for the work that you do on this. You and Senator Blackburn have been real champions on this issue, and I think it really does highlight the significant concentration in this market, where we have basically four players, maybe three in some markets or maybe even two in others, particularly in app stores, where you're at the behest or at the whim of whatever these companies want you to do, so you're basically stuck with whatever privacy policies they decide on.
And so a good example of this is the stuff we're seeing at the DOJ with AG Gail Slater at the helm, where she's been arguing on the remedies case, and the first argument that you got from Google was like, "Hey, you can't do this sharing arrangement because it'll violate privacy." But in reality, what they really care about is scale. They want to harbor the data. They don't really care about the privacy at all. It's really all a ruse.
Sen. Amy Klobuchar (D-MN):
And how can a strong federal privacy law help ensure that interoperability opens up digital markets to competition?
Joel Thayer:
I really point to the idea of a general statute versus a specific statute. And as you know, Senator, the antitrust laws are pretty broad, and so Section 5 of the FTC Act, being able to designate exactly what we're interested in and target the actual acts that we're concerned with, will help regulators down the road, and this is precisely what Mr. Levine was alluding to in bringing that broader framework out. If we say interop is something that we all believe could equal out or balance out the scales, then it gives the regulator the ability to assess it in that way instead of using vague statutes.
Sen. Amy Klobuchar (D-MN):
Okay, last question. Mr. Martino, as you know, I was close friends with John McCain and miss him very much. In your written testimony, you say that businesses should not be responsible for the data privacy practices of other entities whose actions they cannot control, including the big-tech platforms on which we know many businesses now have to rely to reach consumers. How can Congress ensure that responsibility is aligned properly with the entities best suited to protect consumer privacy?
Paul Martino:
Thank you, Senator Klobuchar. Well, I think the core principle we have here is that businesses need to have equivalent requirements, equivalent standards to protect data. There's a chart in my testimony that-
Sen. Amy Klobuchar (D-MN):
We like charts.
Paul Martino:
Yeah, you like charts?
Sen. Amy Klobuchar (D-MN):
They're always fun.
Paul Martino:
I didn't make it real big though, sorry.
Sen. Amy Klobuchar (D-MN):
Oh, oh.
Paul Martino:
But it shows some of the state law requirements for the big-tech service providers. And you'll notice there are a couple of red Xs here on things that I think consumers would expect, and mainstream businesses would expect, their service providers to do, which is provide data security. The state laws, for the most part, except for Colorado, I believe, don't require the big tech service providers to actually secure the data they're processing on behalf of businesses. They're only required to assist the controllers in their own data security and if they have their own breach, so there's a lack of parity there. Another place that I'll mention where there's a red X, and again, I think only Colorado and Connecticut have done this: processors use lots of sub-processors or sub-contractors, and they have requirements that any sub-processor they share the data with has to meet the same standards as the processor.
But they don't give the mainstream business an opportunity to object to those sub-processors or sub-contractors, except in two states that I'm aware of. And that is a big difference between, for example, what happens in Europe and what happens in the U.S. And so if there's a processor that you don't want data passed on to downstream, think of some of the past breaches and privacy violations we've seen before, the mainstream business should have the ability to object to that. So we ask for requirements similar to the ones mainstream businesses have to live by.
Sen. Amy Klobuchar (D-MN):
Okay, thanks. And thank you. Sorry to go over.
Paul Martino:
No, thank you.
Sen. Marsha Blackburn (R-TN):
It is perfectly fine that you went over. This is the first of our hearings that are going to look at this virtual space. And as you all know, Senator Klobuchar and I have done a lot of work in trying to secure the American citizens' privacy in the virtual space. And as we work through this on this committee, I think that foundational to the conversations is who owns an internet user's data and what is the scope of that ownership? Where does it begin? Where does it end? And let's just go down the line, Ms. Goodloe starting with you. And everybody keep it under a minute and answer that question so that we've got that for the record.
Kate Goodloe:
Thank you for the question. Our companies provide business-to-business technologies to other companies. In many cases, their business customers own the data that they store with business-to-business providers. And yes, there may be personal data that individuals own as well, and those individuals should have rights to access, correct, and delete that information, no matter whether it's stored with a consumer-facing company or with the business-to-business provider processing it on behalf of that consumer-facing company.
Sen. Marsha Blackburn (R-TN):
Okay. Mr. Thayer.
Joel Thayer:
Given the lack of appropriate consent regimes, I would say that the user owns that data, because the way we have things set up right now, the data subject doesn't even know that they've given over some of that data. And so at the end of the day, I think the reality is that we have to have privacy regimes in place that outline those particular contours of ownership, but it is 100% the case that you own your data, and it shouldn't be the other way around.
Sen. Marsha Blackburn (R-TN):
Okay.
Paul Martino:
Thank you, Senator, for your question. It's a very good question. There are some nuances here that I think are important. First, main street businesses understand it's the user's data, and the user has the right to correct it, delete it, remove it from their systems. But there are some kinds of data that are considered shared. So, for example, if you make a purchase in a store, well, the store needs to keep a record of that purchase if you want to do a return or an exchange, and for their inventory. So this consumer made this purchase on this date: is that personal information? Yes. Is it also information the business needs and can't just get rid of? Yes. And so I think when it comes down to ownership, we just have to understand that in modern commerce and e-commerce, some information will need to be retained, but only for as long as it's necessary to retain it. And I hope that answers the question.
Sen. Marsha Blackburn (R-TN):
Yeah.
Alan Butler:
Thank you for the question, Chair Blackburn. We believe that we all have a fundamental right to control when our data is used and how it is collected, but individual mechanisms of consent and control don't provide a complete solution to this problem, and that's why we feel it is so important to have rules of the road that protect people's privacy by default and align business data collection and use practices with what consumers reasonably expect.
Sen. Marsha Blackburn (R-TN):
Okay.
Sam Levine:
Thank you Senator. I very much agree with Mr. Butler, data about people should be owned by people. But at the same time, as Alan said, we don't want a world in which people are solely responsible for protecting their own privacy. That's why we need strong federal protections that don't put the onus on people, but put the onus on companies to make sure they're not abusing people's privacy.
Sen. Marsha Blackburn (R-TN):
It was over a decade ago that now Senator Welch and I were in the House at the Energy and Commerce Committee. I know Mr. Martino remembers all of this, and we had a bipartisan legislation to establish a data privacy framework. And of course, big tech fought it, all the way to today we still don't have it into law. So Mr. Thayer, talk for a minute about why big tech has found it so vitally important to kill any effort to have federal online privacy.
Joel Thayer:
Because it's against their financial interest to actually be regulated. I mean, that's the obvious answer, but in reality, what you're pointing out, and I think everyone on this subcommittee has experienced, is that it doesn't matter how tailored you make your legislation, it doesn't matter how measured, they will find some reason and put something forward. If you want to do antitrust reform, for instance, they'll say there's a privacy violation. If you push privacy, they'll say then we don't have to worry about competition. It's always this game of whack-a-mole. And so at the end of the day, they like the way things are because it benefits them. The market is basically created for them. And so I think this is exactly why we have strong advocates fighting for things like the Kids Online Safety Act, where you have parents begging Congress to do something and we're seeing the harms play out right in front of us.
I think at this point we've recognized that big tech is having its "emperor has no pants" moment, and we are all starting to see that we absolutely need the reforms. And so things like the Open App Markets Act are going to be very helpful to quell any of those privacy concerns. The Kids Online Safety Act, I think, will do a lot through measured, targeted approaches that will ultimately help kids. But again, I think the waves are changing and I'm very hopeful. And from what I'm seeing at the DOJ, from the first Trump administration to the Biden administration out of the gate to the new Trump administration, it seems as if everyone has identified that these companies are bad actors and they should not be trusted. So I hope whatever advocacy I can provide would be to outline that... don't fall for the red herrings. Ultimately, the side of right is to protect consumers, and big tech has no interest in doing that.
Sen. Marsha Blackburn (R-TN):
Ms. Goodloe, I want to come back to you. In your testimony you talked about state laws and the importance of some of those state laws. I want you to define a couple of the common elements that you have seen in the state laws that could be transferred into a federal law that should be broadly supported and accepted.
Kate Goodloe:
Thank you for the question. I think the states provide a lot of common ground for Congress to look to as it works toward federal privacy legislation. That common ground exists on things like the consumer rights that we've talked about today: rights to access, correct, delete, and port your data to another service; rights to opt out of the sale of your data, targeted advertising, and certain types of profiling. And states are unanimous in recognizing that there are different types of companies that handle consumers' data. One set of obligations should be assigned to controllers, who decide how and why to collect a consumer's data and how to use it. And one set of obligations should be put on the processors that handle the data on behalf of controllers.
I also want to take a moment to respond to something that Mr. Martino brought up about what processors do when they employ other sub-processors, because in many cases what processors do is bring together a series of sub-processors, package them, and provide the service to business customers at scale, so that small businesses can enjoy the economies of scale of being able to use cutting-edge technologies. That means you are providing the same service to hundreds or thousands of business customers, and letting one object to a package of sub-processors doesn't work.
That's why we haven't seen the majority of states adopt that, which could actually increase security risk to consumers when one of those sub-processors has a breach and they have to go and ask permission to change over the data. But I think we do see broad agreement among the states about the right set of consumer rights and obligations on businesses to safeguard consumers' data and to do so effectively, along with a common, regulator-led enforcement system to ensure we have consistent expectations for companies that want to comply with privacy and security obligations.
Sen. Marsha Blackburn (R-TN):
Mr. Martino, you wanted to respond?
Paul Martino:
Yes, just on the sub-processor point. One thing to keep in mind with the ADPPA, which was the predecessor to the APRA: the way the definitions worked, a sub-processor was also defined as a processor. So once the data got to a processor, there could be this endless train of data sharing that the main street business has no control over. Well, that might be great for the efficiencies of the services that the main processor is providing, but there's no check on the downstream, and so all we've been pushing for is a simple notice to the main street business of the sub-processor you are using and the right to object. It's not an opt-in that would prevent processors from going to those sub-processors and providing these efficiencies. That's an in-the-weeds point, but I think it's an important one, because it's the main street businesses that will be held liable under most of these constructs, because the same requirements aren't applying to the processors and the same enforcement mechanisms aren't applying.
I'll make one last point. In the APRA, the private right of action largely applied only to what are called the controllers, which of course are these main street businesses that can't really control the big tech companies; it hardly applied to the processors, and it didn't apply at all to the third parties. So I think we have to look not just at whether these state laws have requirements, but at who's subject to them and who's liable for those violations.
Sen. Marsha Blackburn (R-TN):
Okay. You had additional questions. Go ahead.
Sen. Amy Klobuchar (D-MN):
It's really an extraordinary panel, so thank you. I guess I would start with you again Mr. Butler. Over time we've seen that these data privacy frameworks move away from a notice and consent regime to focus on data minimization, transparency, consumer control, opt-out rights. Why is notice and consent insufficient for protecting user privacy?
Alan Butler:
Thank you for the question, Senator Klobuchar. I think notice and consent really takes us back to that self-regulation point that was made in the FTC report 25 years ago because that's essentially what it is, it's a rule set that says so long as you disclose in general terms what you're doing, then the law permits it. And of course the incentives there are clear. You put in your disclosure, everything you could ever potentially-
Sen. Amy Klobuchar (D-MN):
That I never read.
Alan Butler:
... with that data.
Sen. Amy Klobuchar (D-MN):
Says the senator who decided, every morning this week I'm going to spend five minutes pushing unsubscribe on my email, and I am still getting them. I've cut what I'm getting in half. Yes, it's a nightmare, right.
Alan Butler:
And it doesn't shift business practices.
Sen. Amy Klobuchar (D-MN):
I know, but it's just really sad. Okay, continue on Mr. Butler.
Alan Butler:
And it doesn't shift business practices, and it doesn't change anything about the surveillance that surrounds us and the data collection that pervades. Which is why a set of data minimization rules that better aligns business practices with the expectations of users, and links the collection and use of data to the services that people are actually requesting, better serves those ends and is a much easier way to solve the problem than, as I mentioned earlier, the individual control concept, which requires us all to make thousands of choices every second of every day and face pop-ups and questions and detailed settings.
Sen. Amy Klobuchar (D-MN):
Yeah, and then you pop the wrong one, suddenly you're in something else.
Alan Butler:
Exactly.
Sen. Amy Klobuchar (D-MN):
Mr. Levine, what barriers does today's notice-and-consent regime for data privacy, which I was just discussing with Mr. Butler, create for enforcers?
Sam Levine:
That's a great question, Senator, and I alluded to it earlier. It's not only our data privacy cases; in so many of the enforcement actions we brought at the FTC over the last four years, we said, look, you surprised consumers, you misled consumers, you abused their data, you shared what medication they were taking with Facebook. And the company says, hold up, we put it all in our privacy policy, and the consumer clicked "I accept" before proceeding to use the service. This is a total fiction. It's a total fantasy that consumers can protect themselves by reading privacy policies. And to Mr. Thayer's excellent point, we can draw a direct line between Congress's, in my opinion, inability to pass privacy laws and big tech lobbying. This is the most valuable industry in the history of the planet, and they have built their revenue not by selling cars, not by selling oil, but by collecting our data and predicting our behaviors. That's how they've built their valuations. They don't want restrictions on what they can collect, and that's why I think it's so important that Congress define what it wants and actually pass a strong bill.
Sen. Amy Klobuchar (D-MN):
Very good. Mr. Martino, in your written testimony you say that businesses should not be responsible for the data practices of other entities. We already went over that, but I guess my second question is this: when you look at the differences between the states as we craft this federal law, and at what stopped us before, how do you think we're going to get around that to get to a place where we can get something done?
Paul Martino:
That's a great question, thank you, Senator. I do think that we start with where the strong consensus of state laws has been. They have outlined, as Ms. Goodloe pointed out, a set of requirements. Our issue has really been with who gets exemptions, who's subject to the liability for violations, and whether the law takes care of it. I would say one of the things you can take from the state laws is that they realize there is this imbalance in negotiating power between smaller mainstream businesses and large big tech companies. So they have taken the route of putting statutory requirements in. We're just asking that you build on that framework and add a few more. One of the key issues in the APRA, and the ADPPA before it, was that there was a big debate over data minimization standards. When the ADPPA was originally drafted, it applied to both covered entities, which are like the controllers or mainstream businesses, as well as the processors.
But processors and big tech did not support that bill until the data minimization standard was changed to apply only to covered entities, or controllers. And that is a fundamental difference. I think while there are very good requirements at the state level, and most of the states are in the same place, it is not the case that everyone in the marketplace is handling data, protecting data for consumers, and honoring their rights to the same level that is being required of the consumer-facing businesses. And we think Americans expect that their privacy is the same everywhere, as I said in my testimony, and we should have requirements that make that happen. In terms of the politics, if big tech has been fighting even when some of the previous bills weren't so heavy on them, then yes, it's going to be more challenging if bills are more fairly balanced and have equivalent standards.
Sen. Amy Klobuchar (D-MN):
One of our best arguments as we look at the politics of this on both sides is affordability. And Mr. Levine, I know you did this study on how the collection of this data can affect affordability. So I look at some of the fresh new arguments we can make to convince our colleagues, which is always fun to do, but we are doing better and better. Could you tell us about that?
Sam Levine:
Well, I think it is a new argument, Senator, because it's a new practice we're seeing more and more companies using. Some people think of privacy as a discrete issue: I have nothing to hide, you have nothing to hide. Privacy is much deeper than that. And what we are finding, and what the FTC study found, is that companies are using these reams of data they've collected, which they've historically used to target people with advertisements. We know that's been very profitable, but they're suddenly realizing they can target people with individual prices. And they go around and they tell members of Congress and state houses, oh, we're just doing this because we want to lower prices and send people discounts. This is ridiculous. They are paying high-priced pricing consultants, companies like McKinsey, to use AI optimization and reams of consumer data to set individual prices.
And they're not doing it to lower their profits. They're not doing it to lower their prices. They're doing it so that they can raise prices on the Americans who are most desperate for goods and services. We have often seen pricing abuses start in the airline industry. That is what we are seeing now with Delta, and I have a lot of concern this is going to spread throughout the economy, and the early results of our FTC study show that it already is.
Sen. Amy Klobuchar (D-MN):
And we've seen the same thing, mostly with rent, by the way.
Sam Levine:
Absolutely.
Sen. Amy Klobuchar (D-MN):
Collecting of data on rent.
Sen. Marsha Blackburn (R-TN):
Let me jump in on this because we really appreciate having all of you here. On surveillance pricing, just do a show of hands, do you think surveillance pricing should be banned?
Sam Levine:
Yes.
Sen. Marsha Blackburn (R-TN):
Okay.
Paul Martino:
How would you define surveillance pricing?
Sen. Marsha Blackburn (R-TN):
Okay. I know, I know. I just wanted a response. Mr. Martino. Yes.
Paul Martino:
Yeah.
Sen. Marsha Blackburn (R-TN):
Okay. And-
Paul Martino:
For the record, my hand was not up. It was down.
Sen. Amy Klobuchar (D-MN):
You have never worked in that-
Paul Martino:
I was asking a question as to what you meant.
Sen. Marsha Blackburn (R-TN):
Yes. I want you to talk then a little bit about shared data, order history, loyalty programs. And then how long you keep that and how you incent that keeping of the data, because that's a choice that somebody makes to enter into that loyalty program.
Paul Martino:
Absolutely, Senator. And in doing so, let me just first address what Mr. Levine said. I know there's the concern that what pricing may happen in one industry or the way those practices go, it may come down to retail. I think there's a very significant difference between the retail industry and let's say some other industries. And it really comes down to competition where you have robust competition like you do in the retail industry and very low profit margins. The goal on retail is volume. It's business, it's attracting new customers, it's growing the business because you have very little profit on each item. And what that leads to is I think, a market constraint. So almost like a de facto regulation in terms of having such severe competition that your competitor is one click or tap away on an app or one stop away. And so what's the mindset of retailers and mainstream businesses is how do I attract more customers?
How do I do that? Well, you have to do that with excellent customer service. I mean, the only way to really differentiate yourself is to do that. And so loyalty plans are one way that it's done. There's a report that I cited to in the testimony called the Bond Brand Loyalty Report. They do it every year. They've been doing it the last 14 or 15 years. They survey consumers, and 85% of consumers say they will continue to buy products from a brand that has a great loyalty program. So yes, loyalty programs are one of those very important features. And it's also important to note that loyalty plans are inherently privacy protective in the sense that they're not foisted on consumers without their choice. You have to opt in to a loyalty program.
You have to be delivered the deal and decide whether you want to do that or not. And the state laws recognize this as well. The only protections for loyalty plans are based on bona fide loyalty plans where a consumer has voluntarily opted in to participate in it. So I think there are ways that one of our principles is that businesses and consumers should be able to freely develop a business relationship. And if businesses on main street can develop those relationships, whether it's a very small business offering buy one get two free, or buy five cups of coffee, get the sixth one free, they should be able to have those kinds of relationships as long as they're privacy protective. And we think they are in terms of making sure they're voluntary.
And it's important to also note that the loyalty programs are subject in the state laws to every other requirement in the law. So whether it's a right to opt out or a right to delete, the consumers have those rights. So we think there are good business ways to do it. And loyalty is something that's been around in the retail industry for centuries. And we could go to general store examples and things from 1890, but the same thing applied back then that applies now.
Sen. Marsha Blackburn (R-TN):
Okay. Mr. Levine.
Sam Levine:
Thank you, Senator. It's one thing to join a loyalty program and say you can track my purchase history in exchange for getting coupons, fine. But what the FTC study showed is that these consultants are telling companies, look at what consumers are Googling. Look at what they're searching online. Look at their location. Look at how they're sorting products. A bunch of California law enforcers actually sued Target for increasing in-app prices while consumers were inside a Target store so they didn't know that they could pay lower prices when they're not at the store. Briefly, with respect to loyalty programs, again, I think if consumers voluntarily turn over information, that's fine. But what we saw in this Delta earnings airline call is what Delta said is we can stop matching prices because of our brand strength, because of our customer's loyalty. And in a world of surveillance pricing, my fear is that companies are going to prey on the consumers who are going to pay the most. You might say they're the most loyal, rather than giving them discounts. That's what we're already seeing in the airline industry.
Sen. Marsha Blackburn (R-TN):
All right. Mr. Martino, come back.
Paul Martino:
I'll keep it to a ten-second response.
Sen. Marsha Blackburn (R-TN):
Yeah.
Paul Martino:
What applies to Delta doesn't apply to Main Street businesses. You have to look at the size of the market, the competition in the market. Airline industry is notorious for being very few competitors, not millions of businesses across America.
Sen. Marsha Blackburn (R-TN):
Years ago as we were starting in on this debate, I would have people take out their keychain and look at their fobs that were on there, and those are programs they were choosing to share information with because of the incentive that would come back to them. Those times have changed.
I want to go to the issue of AI because we're looking at these AI models that are collecting more and more personal data. They are doing tracking, search history monitoring. And as we look at the prevalence of AI, and we've had a hearing on the No Fakes Act to protect name, image, likeness, and voice of individuals, and that in essence is a form of privacy.
But one of the questions that will come before us as we look at developing a federal privacy standard is how you hit that sweet spot of being strict enough to have that preemptive federal enforcement, but yet flexible enough to allow the innovation of new technologies that we see, things that are going to run on quantum rails, things that are going to be AI applications. So Ms. Goodloe, let me come to you on that and then I'd like, Mr. Thayer, for you to give me a response also.
Kate Goodloe:
This is such an important question and thank you for asking it. I think there are a couple of different ways to look at the need for a federal privacy law and its intersection with AI technologies. I think the first thing to look at is a recognition that AI can involve many different types of data. Some of that data may be personal, if an AI system is using personal data that relates to consumers, but a lot of the data used to train AI systems is not personal data. For example, AI systems may be trained to detect weather patterns based on data that's just about the weather and not about people.
But when it comes to AI systems that may be processing personal data, that's where a federal privacy law is very important to create the right set of safeguards so that consumers know their data will be handled responsibly and in trustworthy ways. One key issue is exactly what you pointed out, the need to make sure that a law is flexible enough to allow those products to continue to innovate over time. And I think this is one of the struggles that we've seen as the conversation about data minimization has evolved. That is such an important conversation, but you have to get it right because a standard needs to allow for technologies to get better over time. I expect all of the technology that I use today to be better next year and even better the year after that. And so it is important as you look at these protections to make sure they are flexible over time and to think through the uses that you want to apply to create the right set of safeguards.
Sen. Marsha Blackburn (R-TN):
Okay. Mr. Thayer?
Joel Thayer:
Thank you, Senator, and I think Kate put it very well. There is that nuance when it comes to AI, right? You do have that anonymized data, but there also is the question of what the consumer expected when they gave that data over. So I fail to remember exactly who said it, but it really is, data is the new oil. And what runs the AI machine is data. So the question is where are they getting it and how are they using it?
So I think at the front end, the consumer has to know how is this data going to be used? Is it going to be used to train an AI system? Are there elements of transparency in terms of how this data is going to be used down the road? That comes down to really being upfront with the consumer on where the data is going. And I think that's when it goes back to my testimony when I said that we just feel like it's out of control. We don't feel like we know exactly what happens when we put the data into any application or any use of search.
So the big part of this is going to be transparency and specifically what data these AI systems are training on. Are they training on PII? Are they training on anonymized data? Where are they pulling it? And, Senator, as you well know, there are ancillary issues like intellectual property that are also included into all of this as well. So the big question really comes down to, with respect to privacy, is what rights do citizens have when it comes to protecting their data on the front end so that way it's not used on the back end to do all the parade of horribles that we've already heard about today?
So that's how I see it in most cases. I think at the end of the day, the consumer has to know exactly what their data is going to be used for and whether or not ... And also, on the AI system, what are they training their data on?
Sen. Marsha Blackburn (R-TN):
So you're looking for specificity in that utilization?
Joel Thayer:
Specificity and, at the very least, being able to have the consumer be empowered to say I do not want my data to be used for X, Y, and Z. So it's both.
Sen. Marsha Blackburn (R-TN):
Yeah. The opt in, opt out.
Joel Thayer:
Yeah.
Sen. Marsha Blackburn (R-TN):
Yeah. All right. Do you have any other-
Sen. Amy Klobuchar (D-MN):
I'll just, I think Senator Schiff ... I thought maybe since we have a glass ceiling for only women asking questions here.
Sen. Marsha Blackburn (R-TN):
We kind of like that.
Sen. Amy Klobuchar (D-MN):
I mean, Hawley came by, Blumenthal, they've all had other hearings. They're great and been really helpful to us. Maybe I'll just ask two more and see if he can make it. So Mr. Thayer, in your written testimony, you referenced a European study that found that after the passage of GDPR, the General Data Protection Regulation ...
Sen. Marsha Blackburn (R-TN):
I'm going to have to jump in here because they need me to go to VA to vote.
Sen. Amy Klobuchar (D-MN):
That's where he is.
Sen. Marsha Blackburn (R-TN):
So that's where, yes.
Sen. Amy Klobuchar (D-MN):
That's where Blumenthal is.
Sen. Marsha Blackburn (R-TN):
That's where Senator Blumenthal is. So I will say my thank yous to you all in case I don't get back before this closes and remind you all that we're going to have questions for the record. As you can see, we have lots of questions and we are ever so grateful that you all have come before us. I'll go vote.
Sen. Amy Klobuchar (D-MN):
Okay, thank you very much and thank you again for putting together this hearing. So I was talking about the GDPR. We know we don't like everything that Europeans are doing on tech, but there are some good examples of some good things they've done. What about GDPR were big tech platforms able to take advantage of to entrench their position, and how can we avoid doing the same in the US, and how can we design data privacy standards that rein in abuses? What's the good things we can get out of that? I know there's things we could simply do here that they agreed to in Europe that we're still fighting out over here.
Joel Thayer:
It's a fantastic question and I think it really comes down to defining your goals. That was the first big issue. But in terms of what happened with the GDPR, and to be clear, there are elements of the GDPR that I think a lot of states have latched onto. Particularly Texas, where they pull this analytical framework between data controllers and data processors, being able to articulate exactly who has the responsibility is a big part of it.
Sen. Amy Klobuchar (D-MN):
I just wanted to have the record reflect that Texas used the European model, but keep going.
Joel Thayer:
I fell right into it. But I think where things went a little bit awry was where there was this weird responsibility that the controllers basically had with respect to contractual regulation, I think it's Article 24 of the GDPR, where the controller basically has to dictate specifically, well, first, they have to make the assessment of whether or not the processor is even GDPR-compliant, and that gives the controller a lot of authority over what that smaller company most likely can do and can't do. I think that's one area we may want to stay away from.
But my overall point was that you need privacy and strong antitrust enforcement and competition enforcement. I think the two things go hand in hand. And so I think what Congress is currently looking at, and I think is very important, is that it seems like you guys want to walk and chew gum, which is why I very much appreciate these competition reform bills that are currently being discussed. You are a sponsor of one of those, Senator, which is the Open App Markets Act. I think that goes a long way in quelling some of those concerns. But one of the things I would caution against is creating an overly generalized authority and allowing the controller to have the pure mandate, or at least the pure control, over what the smaller companies are doing. I think that's one way you can avoid some of the pitfalls.
Sen. Amy Klobuchar (D-MN):
Okay. Last two questions, Mr. Martino, and they're related, and then I'll turn to Senator Schiff. We're very excited you're here. Yes, thank you. Mr. Martino, you can follow up on that, but could you talk about the challenges small businesses have operating across state lines quickly? Because I want to give Senator Schiff a chance here.
Paul Martino:
Yeah, certainly, Senator. First, let me just follow up real quickly. I wanted to add a point to what Mr. Thayer was saying. It's just that there are some things that are problematic in the GDPR and some of the expectations put on controllers envision a construct where the controller is the big company and they're getting these smaller processors to do what they want. And that's not what's developed here in the US, where you have very few, almost monopolistic big tech companies who are doing the vast majority of the processing consumers need, including transmission, including broadband and cable. And think about how a Main Street business might only have a choice of one broadband provider and imagine trying to negotiate that contract. I mean, they do the same as we do when we try to argue about a cable bill or a broadband bill. So we've all had that experience.
In terms of the multi-state operations, yeah, it's sort of a sense that, I know I put in my original testimony, many of us live in areas that are tri-state or multiple states are close by. There is travel across state lines, there's shopping, and then certainly online. If there's a boutique store in Minnesota that while you're here in Washington doing your job here, you want to make a purchase from there, you are engaging in interstate commerce. And so it's really important that ... and these privacy laws tend to be set up to apply to the location where the consumer is. So if you're in DC and you don't have a privacy law, are they complying with privacy law there? So what these small businesses need to do is they have to ... I mean, there's a de facto national standard because they have to comply with all these different states, but they're constantly changing, new laws are coming online. So Congress can do a really helpful job by passing a uniform national standard.
Sen. Amy Klobuchar (D-MN):
Yes. Last question here. Mr. Butler, you've advocated for a federal privacy law as well, but you want one that sets a floor. Obviously, this is all going to be political negotiations. But could you talk about why you would take that approach?
Alan Butler:
Sure. Thank you for the question, Senator Klobuchar. As Mr. Martino alluded to, I think the vast majority of businesses in this country just want to know what the rules are. And Congress's traditional role in privacy laws has been to set the baseline standard, but allow states to address new challenges and threats as they emerge. And that's been true, and I have the list here, I could rattle off the list of acronyms, but if you look at federal privacy statutes, by and large, they don't set a ceiling on the level of protection states can provide. But what's really essential here is for the federal Congress to step in and say, "Here's what the consistent standard is." And I think if they do that, then we'll have a consistent standard, companies will know what to comply with, and states still have the flexibility in the future to address new issues.
Sen. Amy Klobuchar (D-MN):
Thank you. Senator Schiff.
Sen. Adam Schiff (D-CA):
Thank you. Thank you for-
Sen. Amy Klobuchar (D-MN):
Filibuster, too.
Sen. Adam Schiff (D-CA):
I understand that you did, and I'm grateful for that and for all your leadership on this issue. Nearly a decade ago, California became the first state in the nation to adopt a comprehensive consumer privacy law, the California Consumer Privacy Act. This was shortly followed by the establishment of the California Privacy Protection Agency, which has served Californians for the last five years by implementing and enforcing the state's privacy laws. Other states have looked at California and our example and followed our lead, especially as new technologies have emerged, AI, facial recognition, algorithmic targeting, each posing more sophisticated threats to Americans' privacy. At the end of the day, California has proven you can be the fourth-largest economy in the world and be home to the most innovative technology companies on the planet and you can still protect consumers' fundamental right to privacy.
To this end, I'd like to enter into the record a letter from the California Privacy Protection Agency on the importance of a federal privacy law that creates robust baseline protections while allowing states like California to continue to adopt stronger protections and respond to the rapidly changing technologies being built in our own backyard. May that letter be entered in the record?
Sen. Amy Klobuchar (D-MN):
Of course it will, yes.
Sen. Adam Schiff (D-CA):
Yes. Thank you. Thank you. The horrific political assassinations last month targeting Minnesota lawmakers that I know ranking member Klobuchar has already referenced were aided in part by a data broker and website the shooter used to look up politicians' addresses. A recent investigation also revealed that a data broker owned and operated by at least nine major US airlines secretly sold Americans' information collected through flight records to US Customs and Border Protection and US Immigration and Customs Enforcement.
Starting on January 1st, 2026, 40 million Californians will be able to go to a single webpage hosted by the California Privacy Protection Agency and request that their data be deleted from over 500 data brokers if they choose. Federal legislation that preempts California's Delete Act without meaningful consideration of state-level protections could mean that Californians will lose this touch-of-a-button ability to know how their data is being used and have a voice in it.
Mr. Butler. Mr. Levine, how can a federal privacy law include better regulation of data brokers, including their registration in a central clearinghouse, and allow Americans to prevent their personal information from being sold to outside entities like we have done in California with the soon-to-be-implemented Delete Act?
Alan Butler:
Thank you, Senator Schiff, for the question. I think that California really has taken the lead here on tackling the problems of data brokers in this specific context. And I think both the requirements of registering, given that the average consumer has no way really to know what data brokers exist and who might have access to their information, and also providing a centralized mechanism to allow for deletion of data held by these entities, are really important protections, especially because this is a massive problem that requires scaled solutions, right? This isn't a situation where an individual consumer can be expected to go to every single one of hundreds or thousands of data brokers and submit individualized requests. So I think both of those are really important protections that have been developed in California.
Sen. Adam Schiff (D-CA):
Mr. Levine, am I pronouncing your name correctly?
Sam Levine:
You are. Thank you, Senator. I fully agree with Mr. Butler on the need for a floor rather than a ceiling, consistent with other federal privacy laws. I'll make a quick point. I started my career at a state attorney general's office in the run-up to the financial crisis. It was state AGs desperately trying to stop subprime mortgages, the innovative products of the day, and it was federal banking regulators, cheered on by big banks, that were actively trying to stop them. So as I hear today big tech companies go around Washington saying we need to hit delete on all of these important state laws like the one you referenced, Senator, I recall similar conversations from two decades ago, and I recall well what happened in our country as a result.
Two quick points specifically on data brokers. The first is that we brought a series of enforcement actions under Chair Khan at the FTC, and what we required data brokers to do: we banned them from sharing sensitive location data and we prohibited them from building profiles of consumers based on sensitive geolocation data. I think that's a really important precedent. I think Congress also acted, I think in the last Congress, with, I'm going to get this wrong, the Protecting Americans' Data from Foreign Adversaries Act, PADFA, giving the FTC enforcement authority. I think it's regrettable that six months into this administration, we've not seen a single enforcement action. I hope that changes.
Sen. Adam Schiff (D-CA):
Madam Chair, do I have time for one more?
Sen. Amy Klobuchar (D-MN):
Oh, yes. Yes.
Sen. Adam Schiff (D-CA):
Okay. Thank you. Over the past few months, I've led a number of letters along with my colleagues to the Trump administration in response to alarming reports that various agency officials have ordered states to hand over the personal data of millions of Medicaid enrollees, as well as SNAP recipients and applicants, to the Department of Homeland Security. These actions are a remarkable departure from established federal privacy protections and should alarm everyone. I've demanded the administration reverse these actions, which likely violate several federal and state privacy laws, including the Privacy Act of 1974, HIPAA, and the Social Security Act.
Mr. Levine, what precedent does it set when federal agencies under the administration simply bypass established privacy laws that have protected Americans for decades and demand that states hand over their residents' most sensitive information with little or no explanation? And how does this compare to privacy protections in other democratic nations? Are we seeing the US now fall behind international standards for protecting citizens' data?
Sam Levine:
Thank you, Senator. I think we have the right standards here, at least with respect to government. It's not clear whether government officials are following them. And that makes me very worried. One of my consistent messages as an enforcer to big tech companies and to everyone is you need to follow privacy laws. And if you don't, there're going to be consequences. And when you have reports, and I've not verified them myself, but when you have reports of federal officials and federal agencies brazenly violating hard-won privacy protections around federal data, resulting in potential loss of healthcare, loss of jobs, loss of housing for Americans, I think that's deeply disturbing and it raises a real question of how Congress is going to pass a privacy law to bind the private sector when the federal government isn't following its own rules. So I completely share your concern and I hope to see changes in that from this administration.
Sen. Adam Schiff (D-CA):
And finally, if I could very quickly, Mr. Butler, you mentioned that there were a list of other privacy laws where Congress had set a floor, not a ceiling. Can you share a few of those with us?
Alan Butler:
Absolutely. And I'm happy to supplement the record with that as well. But just to note that basically every major federal privacy law sets either a floor or a conflict preemption standard, and that includes the Electronic Communications Privacy Act, the Right to Financial Privacy Act, the Cable Communications Privacy Act, the Video Privacy Protection Act, the Employee Polygraph Protection Act, the Telephone Consumer Protection Act, the Driver's Privacy Protection Act, Gramm-Leach-Bliley Act, and the Fair Credit Reporting Act. These are not ceiling preemptions. They don't limit states' abilities to adapt and evolve and protect their citizens more.
Sen. Adam Schiff (D-CA):
Well, thank you. Thank you, Ranking Member. Appreciate it.
Sen. Amy Klobuchar (D-MN):
Okay, very good. Well, thank you. And this is a lot of great testimony and answers. I just can't tell you how inspired I am from this work and Marsha's willingness to put this panel together, the good questions and just, I always think maybe we can do this, maybe we can actually get a privacy standard, and then I get excited and then it's hard.
But as this gets more and more important and with the advent of AI and just the patchwork, and maybe we can get some more incentives going to try to get to a better place on this, despite what everything would seem. And what gives me hope is just the people that are involved in this subcommittee, people we work with on commerce, and their ability to take risks in terms of what everyone wants them to do and try to find some common ground on this issue, which we have done several times. So I just want to thank all of you for the testimony and the hearing record will remain open for, yeah, we're making it up, for one week. And the hearing is adjourned. Thank you.