Transcript: Senate Holds Hearing On AI Deepfakes and the NO FAKES Act
Cristiano Lima-Strong / May 22, 2025
Sen. Marsha Blackburn (R-TN) speaks during the Senate subcommittee hearing on Wednesday May 21, 2025. (X)
Senators used a hearing Wednesday to highlight legislation that would bar people from using artificial intelligence to make unauthorized copies of someone else’s voice or likeness.
The Senate Judiciary Committee session, “The Good, the Bad, and the Ugly: AI-Generated Deepfakes in 2025,” largely focused on efforts to pass the NO FAKES Act, S. 1367, a bipartisan bill led by Sens. Chris Coons (D-Del.) and Marsha Blackburn (R-Tenn.). The bill would give people a property right over AI-generated replicas of them and create a notice-and-takedown mechanism so they can request that deepfakes be removed from online platforms.
The measure was initially unveiled as a discussion draft in 2023 and formally reintroduced last month. Senators signaled Wednesday they will push to advance the bill out of committee.
Key moments included:
- Senators warned that the proliferation of AI posed a major threat to both artists, whose work could be more easily ripped off with the tools, and children, who could be subjected to abuse through deepfakes. “The proliferation of these digital replicas created without the artists’ consent poses a real threat to their livelihoods and the livelihoods of all American artists and creators,” said Sen. Marsha Blackburn (R-Tenn.), whose subcommittee convened the hearing.
- Supporters of the NO FAKES Act sought to seize on the momentum of the TAKE IT DOWN Act, another measure dealing with AI forgeries that President Donald Trump signed into law earlier this week. “The NO FAKES Act is a perfect complement to [TAKE IT DOWN] by preventing AI deepfakes that steal someone's voice or likeness and use them to harass, bully and defraud others,” said country music singer Martina McBride, who testified in support of the bill.
- Senators on both sides of the aisle voiced concern about a House proposal to stop states from regulating AI for ten years. Blackburn spoke out against overriding laws like Tennessee's ELVIS Act, which prohibits using AI to replicate artists’ voices without their consent. “Until we pass something that is federally preemptive, we can’t call for a moratorium on those things,” Blackburn said. Sen. Amy Klobuchar (D-Minn.) praised the remarks.
Below is a lightly edited transcript of the hearing, “The Good, the Bad, and the Ugly: AI-Generated Deepfakes in 2025.” Please refer to the official audio when quoting.
Sen. Marsha Blackburn (R-TN):
The Senate Judiciary Subcommittee on Privacy, Technology, and the Law will come to order. And I want to thank all of you for being here with us this afternoon. As you can see, everyone is curious about AI, and today we are going to talk about AI and how this affects our content creators, our creative community, our children, and how it affects all Americans when it comes to the digital space and what happens with AI-generated deepfakes. This hearing is titled “The Good, the Bad, and the Ugly,” and it is titled in that regard for a specific reason. Now in Tennessee, we talk about how there is a lot of good that has come from AI when you're talking about logistics, advanced manufacturing, healthcare, cutting-edge research, and we've even seen the amazing role that AI has played in giving voice to someone, Randy Travis to be specific, who joined us here on the Hill recently for the introduction of the No Fakes bill. It gave Randy Travis the ability to share his talent with the world once again.
And despite some of these benefits, there are really some bad and unpleasant sides to AI, specifically when it comes to AI-generated deepfakes. These deepfakes cause tremendous harm. And today we are going to examine those harms and the legislative solutions, including the No Fakes Act that Senators Coons, Klobuchar, Tillis, and I have introduced. We've introduced it specifically to address these harms. First, these deepfakes pose significant harm to our content creators. From Music Row to Beale Street back over to the Smoky Mountains in upper East Tennessee, Tennesseans have made their mark in the music world, and we've got one of those artists with us today. But the proliferation of these digital replicas created without the artists' consent poses a real threat to their livelihoods and the livelihoods of all American artists and creators. The No Fakes Act is a monumental step forward in protecting our creative community.
It provides landmark protection of the voice and visual likenesses of all individuals and creators from the spread of these digital replicas that are created without their consent. And I'm looking forward to speaking with our witnesses about this critical bill and how impactful it will be for the creative community. And I've got to be clear, our efforts must protect all Americans from the harms of deepfakes, and that includes our precious children. In recent years, we've seen a deeply troubling spike in the use of generative AI to create sexually explicit deepfake content. Just as concerning, NCMEC saw a, get this number, 1,325% increase from 2023 to 2024 in reports involving generative AI. We've got to do something about that. And both the No Fakes Act and the Take It Down Act, which President Trump just signed into law this week, go a long way to providing greater protections for our children from these deepfakes.
These deepfakes have also served as a powerful tool for fraud. In one example, scammers used AI-generated images and voices of a multinational firm's CEO to steal millions of dollars. We've also seen celebrities' likenesses misappropriated in false product endorsements. It's clear that Congress has to act, and that's why the three of us sitting right here on this dais have joined forces, plus Senator Tillis, who's going to get here in a little bit, to work on the No Fakes Act and get it to President Trump's desk this year. We know that the creative community, all these content creators, our children, and all Americans deserve nothing less than our best efforts on this issue. And I turn to Senator Klobuchar for her opening statement.
Sen. Amy Klobuchar (D-MN):
Well, thank you very much, Senator Blackburn. I'm very excited about this subcommittee and the work we've already done together for years on this issue and similar issues when it comes to tech. I share your hopes for AI and see that we're on the cusp of amazing advancements if this is harnessed in the right way, but I'm also concerned if things go the wrong way. I think it was Jim Brooks, a columnist, who said he has trouble writing about it because he doesn't know if it will take us to heaven or hell. So it's our job to head to heaven, and it's our job to put some rules in place, and this is certainly one of them. We want this to work for children, for consumers, for artists, and not against them. And you brought up, Madam Chair, the example of Randy Travis, who was at the event that we recently had with you and Senator Coons and myself about the bill, and how he used AI in such a positive way.
But then we know there are these risks, and one of the things that I think is really exciting about this week is that, in fact, on Monday the president signed my bill with Senator Cruz, the Take It Down Act, into law. This was a bill I discussed with him and the first lady at the inaugural lunch. It's an example of using every moment you have to advance a cause. And then she supported the bill and helped to get it passed in the House after Senator Cruz and I had already passed it in the Senate; we were having some trouble getting it done over in the House. So we're really pleased, because it actually does set some track moving forward. That bill is about non-consensual porn, both AI-created and non-AI-created, and it's had huge harmful effects: about 20-some suicides a year of young kids who think they're sending a picture innocently to a girlfriend or a potential boyfriend, and then it gets sent out on their school internet.
It gets sent out to people they know, and basically they believe their life is in ruins and don't have any other context and take their own lives. And that's just the most obvious and frightful part of this, but there's others as well. So I'm hoping this is going to be a first step to some of the work that we can do, including with the bill that we're going to be discussing today. So AI-enabled scams have become far too common. We know that it takes only a few seconds of audio to clone a voice. Criminals can pull the audio sample and personal backstory from public sources. Just last week, the FBI was forced to put out an alert about scams using AI-cloned voices of FBI agents and officials asking people for sensitive payment information. Jamie Lee Curtis was forced to make a public appeal to Mark Zuckerberg to take down an unauthorized deepfake ad that included her digital replica endorsing a dental product. While Meta removed the ad after her direct outreach, most people don't have that kind of influence.
We also need rules of the road to ensure that AI technologies empower artists and creators and not undermine them. Art doesn't just entertain us. It's something that uplifts us and brings us together. When I recently met with Cory Wong, a Grammy-nominated artist from Minnesota, he talked about how unauthorized digital replicas threaten artists' livelihoods and undermine their ability to create art. So this is not just a personal issue, it's also an economic issue. One of our country's best exports to the world is music and movies. When you look at the numbers and how we've been able to captivate people around the world, that's going to go away if people can just copy everything that we do. And one of the keys to our success as a nation in innovation has been the fact, and Senator Coons does a lot of work in this area, that we've been able to respect copyrights and patents and people's rights to their own products.
So that's why this No Fakes Act is so important. It protects people from having their voice and likeness replicated using AI without their permission, all within the framework of the Constitution. And it protects everybody, because everyone should have a right to privacy. I also am working in the AI space to put some base rules in place in my role on the Commerce Committee. Senator and I have a bill that we're reintroducing on this to set some rules for NIST to be able to put out there for companies that are using AI. And then I'm always concerned about its effect on democracy, but that is for a different day and in a different committee. But I do want to thank Senator Blackburn for her willingness to come out on doing something about tech, including the work she does with Senator Blumenthal and the work that we've done together on commerce. And if Monday is any sign, with the first bill getting through and the Rose Garden signing ceremony, there's more to come. And so thank you, and I look forward to hearing from the witnesses.
Sen. Marsha Blackburn (R-TN):
Thank you, Senator Klobuchar. Senator Coons, you're recognized.
Sen. Chris Coons (D-DE):
Thank you so much, Chair Blackburn and Ranking Member Klobuchar. It is a delight to work with you, and thank you for inviting me to give some brief opening remarks about the No Fakes bill. Because of you and Senator Tillis working on this together since 2023, we have made real progress. There is momentum with this bill. We've been adding co-sponsors; my thanks to Senators Durbin, Hagerty, Schiff, and Cassidy. We're adding organizations that are endorsing it, like YouTube and RAINN. And as we saw at the White House on Monday, if there's bipartisan agreement in Congress and support from the White House that action is needed, we can make progress in complex, challenging technical areas. This hearing's a chance to look critically at the current state of the No Fakes bill, so we can both build on that momentum and answer the questions: what did we get right?
What do we need to tweak? How can we get more co-sponsors and push to a full committee markup? So I'm excited to hear from our witnesses today. There are two other committee hearings going on right now, which is why you will see senators come in and out; it is not a lack of interest. When we were drafting this bill, its applicability to pillars of the creative community like Ms. McBride, Martina McBride, or to a movie star like Tom Hanks, its applicability to people who make a living off of their voice or likeness, was clear. But Senator Blackburn and I agreed at the outset that the rules we were drafting should apply to everyone. Everyone should have the power to control their digital replica online, not just those who are superstars. So I appreciate, Chair Blackburn, the witnesses you brought together today, who speak to the full scope of what this bill can do to keep the public safe from scams, just like the bill Senator Klobuchar just got signed into law that will help wipe non-consensual deepfake pornography off the internet. Second, the revised draft we introduced last month was the product of stakeholders negotiating in good faith. Ms. Carlos, you and YouTube came to the table with the intention of getting to yes, and we got there. And if Google can get behind this bill and can handle the obligations that No Fakes imposes, so can the other tech platforms. Thank you. I look forward to hearing from you and returning for questions.
Sen. Marsha Blackburn (R-TN):
Thank you, Senator Coons. I would like to introduce our witnesses. Martina McBride is a Nashville-based singer-songwriter who has sold more than 23 million albums worldwide, with six singles hitting number one on the country music charts, in addition to her 14 Grammy Award nominations. Ms. McBride is a four-time Country Music Association Female Vocalist of the Year, a three-time Academy of Country Music Top Female Vocalist, and a member of the Grand Ole Opry. She first signed to RCA Records in 1991 and has since been awarded 14 gold records, 12 platinum honors, three double platinum records, and two triple platinum awards. Mitch Glazier is the CEO and chairman of the Recording Industry Association of America; we use the acronym RIAA. He helps to represent the rights and interests of over 1,600 member labels. Prior to joining RIAA, Mr. Glazier served as chief counsel for intellectual property to the US House of Representatives Judiciary Committee, as well as in numerous other roles in and around government, including as a commercial litigation associate.
He earned his bachelor's degree from Northwestern University and his JD from Vanderbilt Law School. Our next witness is Christen Price. Ms. Price serves as senior legal counsel for the National Center on Sexual Exploitation, NCOSE, correct? She works to combat all forms of sexual exploitation and advocate for justice for survivors of sex trafficking, child sexual abuse, pornography, and prostitution. Before her work at NCOSE, Ms. Price served as legal counsel at the Alliance Defending Freedom, where she specialized in First Amendment law and conscience protections. Ms. Price earned her bachelor's degree from Cedarville University and her JD from Georgetown University Law Center. Our next witness is Mr. Justin Brookman. Mr. Brookman is the director of technology policy for Consumer Reports, where he specializes in data privacy and security issues. Before joining Consumer Reports, he was policy director of the Federal Trade Commission's Office of Technology Research and Investigation. Earlier in his career, he served as chief of the Internet Bureau of the New York Attorney General's office.
He earned his bachelor's degree from the University of Virginia and his JD from New York University School of Law. Our final witness is Ms. Suzana Carlos, who serves as head of music policy at YouTube. Until 2022, she served as senior counsel for YouTube music publishing, and she has held senior positions at the American Society of Composers, Authors and Publishers, which we like to call ASCAP, at Universal Music Group, and at EMI Publishing. She is also on the board of the Digital Media Association, which represents the leading global audio streaming companies and promotes legal access and engagement of music content between creators and users. Ms. Carlos earned her bachelor's at the University of California, Los Angeles and her JD from Fordham University School of Law. Welcome to each of you. At this time, I want to ask you all to rise and raise your right hands. Do you affirm that the testimony that you're going to give to this committee is the truth, the whole truth, and nothing but the truth, so help you God? Let the record reflect that all answered in the affirmative. We'll begin with our testimony. Ms. McBride, you are recognized for five minutes, and welcome. Thank you.
Martina McBride:
Chairman Blackburn, Ranking Member Klobuchar, Senator Coons, and members of the subcommittee, thank you for inviting me to speak about S. 1367, the No Fakes Act of 2025, a landmark effort to protect human voices and likenesses from being cloned by artificial intelligence without consent. I'm so grateful for the care that went into this effort, and I want to thank you and your colleagues for making this issue a priority. I started singing when I was four years old, and my voice is at the center of my art form. Each of my recordings includes a piece of me that is individual and unique. Songs reflect the human experience, and I'm honored that they're a part of people's lives, from wedding vows to breakups, to celebrating milestones and even the special relationship between a mother and daughter. But today, my voice and likeness, along with so many others, are at risk.
AI technology is amazing and can be used for so many wonderful purposes, but like all great technologies, it can also be abused, in this case by stealing people's voices and likenesses to scare and defraud families, manipulate the images of young girls in ways that are unconscionable, impersonate government officials, or make phony recordings posing as artists like me. It's frightening and it's wrong. Congress just took a very important step forward to deal with sexually explicit deepfake images by passing the Take It Down Act. I want to thank all the leaders, including Senators Cruz, Klobuchar, Blackburn, and many on this committee, who worked hard with others to push that bill into law. The No Fakes Act is a perfect complement to that effort by preventing AI deepfakes that steal someone's voice or likeness and use them to harass, bully, and defraud others, or to damage their career, reputation, or values.
The No Fakes Act would give each of us the ability to say when and how AI deepfakes of our voices and likenesses can be used. If someone doesn't ask before posting a harmful deepfake, we can have it removed without jumping through unnecessary hoops or going to court. It gives every person the power to say yes or no about how their most personal human attributes are used. It supports AI technology by providing a roadmap for how these powerful tools can be developed in the right way, and it doesn't stand in the way of protected uses like news, parody, or criticism. I want to thank the technology companies like OpenAI and Google who support this bill, as well as the legions of creators who have worked so hard to advocate for it and the child protection and anti-sex trafficking and exploitation groups who support it and continue to fight for those who are most vulnerable. In my career, it's been a special honor to record songs that shine a light on the battles that many women fight, especially the terrible battle of domestic violence. Many fans have told me that the song “Independence Day” has given them strength, and in some cases the song has been the catalyst that has made them realize that they need to leave an abusive situation. Imagine the harm that an AI deepfake could do, breaching that trust by using my voice in songs that belittle or justify abuse.
One of the things I'm most proud of in my career is that I've tried to conduct myself with integrity and authenticity, and the thought that my voice could be deepfaked or my likeness could be deepfaked to go against everything that I've built, to go against my character, is just terrifying, and I'm pleading with you to give me the tools to stop that kind of betrayal. Setting America on the right course to develop the world's best AI while preserving the sacred qualities that make our country so special, authenticity, integrity, humanity, and our endlessly inspiring spirit: that's what the No Fakes Act will help to accomplish. I urge you to pass the bill now. Thank you.
Sen. Marsha Blackburn (R-TN):
We thank you. Mr. Glazier, you're recognized for five minutes.
Mitch Glazier:
Thank you so much. Thank you for having me. I'm honored to testify today alongside the groundbreaking artist Martina McBride, who just spoke so eloquently about the value of someone's voice, the value of their image, and the threats posed by abuses of deepfake technology. I'd also like to recognize the almost 400 artists, performers, and actors who have just signed a statement in support of the No Fakes Act with some very simple words: it's your voice, your face, your image, your identity. Protect your individuality. That's why we're here. That's what this is all about. Artists' voices and likenesses are fundamental to their work, credibility, expression, and careers. In many ways, these deeply personal, highly valuable attributes are the foundations of the entire music ecosystem, and unauthorized exploitation of them using deepfakes does cause devastating harm. We have to prevent that harm. So my deepest thanks, and the thanks of a very grateful music community, go out to all of you: to Chair Blackburn, to Ranking Member Klobuchar, to Senator Coons, and to all of the senators, Senators Tillis, Hagerty, Durbin, Cassidy, Schiff, and I hope many more on this committee and throughout the Senate, for introducing and supporting the No Fakes Act.
You did it after months, actually years, of work with each other, stakeholders, and your counterparts in the House. You've been able to build bipartisan, bicameral, broad-based consensus around legislation that will protect not just artists but all victims of deepfake abuses, including child exploitation and voice clone scams, which we'll hear about from the other witnesses. Today you've shaped a common-sense bill that has won the support of AI companies like Google, which is here today, OpenAI, and IBM, as well as broadcasters, motion picture studios, child protection groups, free market groups, labor unions, and virtually the entire creative community. That's hard to do. The No Fakes Act provides balanced yet effective protections for all Americans while supporting free speech, reducing litigation, and promoting the development of AI technology. It empowers individuals to have unlawful deepfakes removed from UGC platforms as soon as it can be done, without requiring anyone to hire lawyers or go to court in those situations.
It contains clear exemptions for uses typically protected by the First Amendment, such as parody, news reporting, and critical commentary, and it encourages AI development and innovation, targeting only malicious applications and setting the stage for the legitimate licensing of rights with real and meaningful consent. No Fakes is the perfect next step to build on after the Take It Down Act. It provides a civil remedy to victims of invasive harms that go beyond the criminal posting of intimate images addressed by that legislation, and it protects artists like Martina from non-consensual deepfakes and voice clones that breach the trust she's built with millions of fans. American music is the most valuable music in the world. We lead in investment, exports, and market power. Music drives the success of other important American industries, including the technology industry, through thriving partnerships. If we signal to the rest of the world that it's acceptable to steal Americans' voices and likenesses, we have the most to lose.
Our voices and our music are the most popular and will be taken the most, destabilizing the music economy, our intellectual property system, our national identity, and the very humanity of the individuals who bless us with their genius. The No Fakes Act is a critical step in setting America up as an example and in continuing and extending its global leadership in innovation and creativity. It shows that we can boost AI development while preserving every individual's autonomy and individual liberties and protecting constitutional property rights at the same time. We are really proud to support this legislation, and we vow to help you pass it into law this year. Thank you again.
Sen. Marsha Blackburn (R-TN):
We thank you, Ms. Price. You're recognized for five minutes.
Christen Price:
Chair Blackburn, Ranking Member Klobuchar, thank you for holding this hearing and addressing this truly urgent matter. My name is Christen Price, senior legal counsel at the National Center on Sexual Exploitation, a nonpartisan nonprofit dedicated to eradicating all forms of sexual exploitation by exposing the links between them. Our law center represents survivors in lawsuits against those who perpetrate, enable, and profit from sex trafficking, including pornography companies. Contemporary pornography depicts and normalizes violence, including asphyxiation, electrocution, and rape. This is pervasive. The top four sites, Pornhub, XVideos, xHamster, and XNXX, had nearly 60 billion total visits in 2024. One woman's husband sexually assaulted her while she was sleeping and put the video on XVideos, where it was tagged “sleeping pills.” Pornhub hosts child sexual abuse material and sex trafficking content, with their employees admitting that traffickers use their sites with impunity.
Forged, or deepfake, pornography uses AI that is trained on this kind of abusive content, merging it with the faces of other women and girls. A 2023 report found that deepfake pornography increased by 464% between 2022 and 2023. The top 10 deepfake pornography sites had 300 million video views in 2023. 98% of all deepfake videos are pornography-related, and 99% of those who are targeted are women. The perpetrators are disproportionately male. One survey found that 74% of deepfake pornography users don't feel guilty about it. A high schooler discovered that a boy she'd never met took a photo of her off Instagram, created an AI deepfake, and circulated it through Snapchat. Two years later, she still hasn't been able to remove all the images. A woman whose close family friend made deepfake pornography of her said, “My only crime was existing online and sharing photos on platforms like Instagram.
The person who did this was not a stranger. I was not hacked, and my social media has never been public.” These are serious human rights abuses, violating the person whose face is depicted and the person whose body is shown. Survivors report fear, isolation, shame, powerlessness, suicidal thoughts, doxing, harassment from sex buyers, and difficulty attending school, maintaining jobs, and participating in public life. This is a form of sexual exploitation from which it is impossible to fully exit. There is a very old idea that to protect more privileged women from male violence, society needs an underclass of women that men can violate with impunity. This was always a morally inexcusable premise, and the rise of forged pornography shows that it is also a lie. Deepfake technology allows any man to turn any woman into his pornography. These are impossible conditions for equality. As Andrea Dworkin stated in 1986, the civil impact of pornography on women is staggering.
It keeps us socially silent, socially compliant. It keeps us afraid in neighborhoods, and it creates a vast hopelessness for women, a vast despair. One lives inside a nightmare of sexual abuse that is both actual and potential, and you have the great joy of knowing that your nightmare is someone else's freedom and someone else's fun. The harms are severe and irreversible, so deterrence is essential. This is why NCOSE supported the bipartisan effort to pass the Take It Down Act, which the president signed into law on Monday and which requires online platforms to remove non-consensual content within 48 hours of being notified. NCOSE strongly supports three additional bills that complement Take It Down: the No Fakes Act, the Kids Online Safety Act, and the Defiance Act. These bills help protect individuals from the harmful effects of image-based sexual abuse and increase pressure on tech companies to manage websites more responsibly. Finally, NCOSE is concerned about the recent AI state moratorium language included in the House budget reconciliation bill, as it creates a disincentive for AI companies to put safety first. Technological progress should not come at the expense of human dignity. It is our collective responsibility to protect the voice, face, and likeness of every person from unauthorized use. Thank you.
Sen. Marsha Blackburn (R-TN):
We thank you, and I will note for the record that we're submitting your full testimony into the record with all of your footnotes. I really appreciate that. Thank you so much, Mr. Brookman. You're recognized for five minutes.
Justin Brookman:
Thank you, Chairwoman Blackburn and Ranking Member Klobuchar. Thank you very much for the opportunity to testify here today. I'm here on behalf of Consumer Reports, where I head up our work on tech policy advocacy. We're the world's largest independent testing organization. We use our ratings, our journalism, our surveys, and our advocacy to try to create a fairer, healthier, and safer world. I'm gratified the committee is focusing on the problems created by audio and video deepfakes, which for better or worse are getting more realistic and convincing every day. They're used in romance scams and grandparent scams, where a relative gets a frantic call from a distressed family member who's in immediate need of cash. As the chairwoman noted, they're used in fake testimonial videos from celebrities hawking everything from meme coins to cookware. I believe Elon Musk is one of the most frequently impersonated celebrities online.
As Ms. Price testified eloquently, obviously one of the most prevalent uses is for the creation of non-consensual intimate images and videos. And they're increasingly used to propagate misinformation, certainly in the political realm, but also in the more petty personal realm. There's a story in Maryland recently about a teacher who created deepfake audio of his boss saying racist and antisemitic slurs. As this last example shows, realistic cloning tools are easily available to the public and very cheap and easy to use. Earlier this year, Consumer Reports conducted a study of six voice cloning tools that are easy to find online, to see how easy it would be to create fake audio based on a public recording like a YouTube clip. Our study found that four of the six companies we looked at didn't employ any reasonable technical mechanism to ensure they had the consent of the person whose voice was being cloned.
Instead, the customer just had to click a box saying, yes, I have the person's consent. Two required the person to read a script, to help indicate the person was on board with having their voice cloned. Four of the companies also did not collect much identifying information from customers, just a name or an email address, to start creating deepfake voice clones. Given how likely these services are to be abused, I don't think they were doing enough, and a lot of our members agree. We recently got 55,000 signatures on a petition asking the Federal Trade Commission and state attorneys general to investigate whether these services were in violation of existing consumer protection laws. And that brings me to solutions. One thing: we need strong consumer protection agencies who have the resources to crack down on abuse of emerging technologies. Last year, the FTC brought a handful of AI cases.
It's part of Operation AI Comply, but they don't have the capacity right now to confront the massive wave of scams and abuses online. Tools and responsibilities: I think some of these AI-powered tools are designed such that they're almost always going to be used for ill purposes, whether it's deepfake pornographic image generators or voice impersonation. Developers of these tools need to have heightened obligations to try to forestall harmful uses. If they can't do that, then maybe they should not be freely available to the public. Platforms, too, need to be doing more to proactively get harmful material off their platforms. It's a very difficult job. It takes resources, but it absolutely needs to be done. Transparency: people deserve to know whether the content they're seeing online is real or fake. I know there have been a number of bills introduced in this Congress to try to address that.
There's also a law recently passed in California to start to put transparency obligations on entities that make deepfake content. Stronger privacy and security laws: as this committee knows very well, the United States generally has fairly weak legal protections. As the ranking member noted, the ready availability of information about us online makes it easier for scammers to target us with scams. We've seen a ton of progress at the state level on privacy and security laws, but they're not strong enough. Whistleblower protections and incentives: in many cases, we only find out about abuses inside these tech companies when someone comes forward with their story. I was glad to see bipartisan legislation protecting AI whistleblowers introduced on this issue in the last week. Education: I don't want to put all the burden on consumers, but the reality is this is the world we live in. We need to teach people to look out for these sorts of scams.
We are part of a campaign called Pause, Take Nine, which tries to train people that if they get an urgent call to action, they should pause, take nine seconds, and think about whether this is real or not. And finally, I want to echo the words of Ms. Price: there has been a lot of discussion about a moratorium on state laws policing bad uses of AI, and I want to stress that this is the wrong idea. AI has tremendous, amazing potential, but as this hearing shows, it has some real potential harms as well. The states have been leaders in trying to address these harms, whether it's privacy, the co-opting of performers' identities, regulating self-driving cars, rooting out hidden biases, or deepfakes. AI is an incredibly powerful technology, but that does not mean it should be completely unregulated. Thank you very much, and I look forward to answering your questions.
Sen. Marsha Blackburn (R-TN):
And Ms. Carlos, you're recognized for five minutes.
Suzana Carlos:
Chairwoman Blackburn, Ranking Member Klobuchar, and members of the subcommittee, thank you for the opportunity to speak with you today on the important topic of the No Fakes Act and AI-generated digital replicas. My name is Suzana Carlos, and I serve as the head of music policy for YouTube. Just last month, YouTube marked the 20th anniversary of the first video ever uploaded to our platform. It's difficult to fathom how much the world and YouTube have changed in those two short decades. Today we have over 2 billion monthly active users on our platform across more than a hundred countries, with 500 hours of content uploaded every minute. We are proud that YouTube has transformed culture through video and built a thriving creator economy here in the United States and around the world. Our unique and industry-leading revenue sharing model empowers our creators to take 55% of the revenue earned against ads on their content.
As a result, YouTube's creative economy has contributed more than $55 billion to the United States gross domestic product and supported more than 490,000 full-time American jobs in the last year alone. In the three years prior to January 2024, YouTube paid more than $70 billion to creators, artists, and media companies. At YouTube Music, we've built one of the world's deepest catalogs: over a hundred million official tracks, plus remixes, live performances, covers, and hard-to-find music you simply can't find anywhere else. We've now reached over 125 million paid YouTube Music and Premium subscribers, and YouTube continues to be at the forefront of handling rights management at scale, protecting the intellectual property of creators and our content partners, ensuring that they can monetize their content, and keeping YouTube free for viewers around the world. In 2007, YouTube launched Content ID, a first-of-its-kind copyright management system that helps rights holders effectively manage their works.
Rights holders or their agents provide YouTube with reference files for the works they own, along with metadata such as title and detailed ownership rights. Based on these references, YouTube creates digital fingerprints for the works in question and scans the platform to determine when content in an uploaded video matches the reference content. Rights holders can instruct the system to block, monetize, or track the referenced content, and over 99% of the copyright issues on YouTube are handled through Content ID. It has also proven to be an effective revenue generation tool for rights holders, as over 90% of Content ID claims are monetized. As we navigate the evolving world of AI, we understand the importance of collaborating with partners to tackle emerging challenges proactively. We firmly believe that AI can and will supercharge human creativity, not replace it. Indeed, AI has the potential to amplify and augment human creativity, unlocking new opportunities for artists, creators, journalists, musicians, and consumers to engage creatively with new tools and play an active role in innovation.
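[The reference-file, fingerprint, and policy flow described in this testimony can be sketched very loosely in code. This is a hypothetical toy model only: exact chunk hashing stands in for YouTube's proprietary perceptual audio/video fingerprinting, and all names here are invented for illustration.]

```python
# Toy sketch of reference-based content matching: rights holders register
# a reference with a policy ("block", "monetize", or "track"); uploads are
# fingerprinted and checked against the registered references.
import hashlib
from dataclasses import dataclass

@dataclass
class Reference:
    rights_holder: str
    policy: str          # "block", "monetize", or "track"
    fingerprint: str

def fingerprint(chunks):
    """Toy fingerprint: hash each content chunk and join short digests."""
    return "|".join(hashlib.sha256(c.encode()).hexdigest()[:8] for c in chunks)

def scan(upload_chunks, references):
    """Return the policy of the first reference matching the upload."""
    fp = fingerprint(upload_chunks)
    for ref in references:
        if ref.fingerprint == fp:
            return ref.policy
    return "no_match"

refs = [Reference("Label A", "monetize", fingerprint(["verse", "chorus"]))]
print(scan(["verse", "chorus"], refs))   # matching upload: "monetize"
print(scan(["other", "audio"], refs))    # non-matching upload: "no_match"
```

A real system would use robust perceptual fingerprints that tolerate re-encoding, pitch shifts, and partial matches, which is precisely what makes matching at platform scale hard.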
We are already seeing creators exploring new areas, including the creation of new music, books, photography, clothing, pottery, games, and other art created in collaboration with AI models. As this technology evolves, we must collectively ensure that it is used responsibly, including when it comes to protecting our creators and viewers. Platforms have a responsibility to address the challenges posed by AI-generated content, and Google and YouTube stand ready to apply our expertise to help tackle them on our services and across the digital ecosystem. We know that a practical regulatory framework addressing digital replicas is critical, and that is why we are especially grateful to Chairwoman Blackburn, Senator Coons, Ranking Member Klobuchar, and all the bill sponsors for the smart and thoughtful approach adopted in developing the No Fakes Act of 2025. We deeply appreciate their willingness to bring a variety of stakeholders together to forge a consensus on this important topic.
YouTube and Google are proud to support this legislation, which tackles the problems of harm associated with unauthorized digital replicas and provides a clear legal framework to address these challenges and protect individuals' rights. The No Fakes Act appropriately balances innovation, creative expression, and individuals' rights while offering a broadly workable, tech-neutral, and comprehensive legal solution. By supplanting the need for a patchwork of inconsistent legal frameworks, the No Fakes Act will streamline global operations for platforms like ours and empower artists and rights holders to better manage their likenesses online. We have similarly been proud to support the Take It Down Act, because it's critical to prevent bad actors from producing and disseminating non-consensual explicit images. We would like to thank Ranking Member Klobuchar, along with Senator Cruz, for their leadership on that legislation. This is an area we continue to invest in at Google, building on our longstanding policies and protections to ultimately keep people safe online. We look forward to seeing the No Fakes Act passed by Congress and enacted into law. Thank you again for inviting me to participate in today's hearing. I look forward to your questions.
Sen. Marsha Blackburn (R-TN):
And we thank you all for sticking to the five-minute clock. I didn't have to gavel down a single person; these are great content creators. So there we go. I'm going to recognize myself for five minutes for questions, and as Senator Coons said earlier, there are going to be members coming and going because we do have a variety of hearings going on. Ms. McBride, I want to come to you first. I think that yours is such an important perspective as we talk about this, the direct impact on someone who is creating content. And I appreciated so much that in your testimony you talked about how your voice and likeness, along with those of so many other creators, are at risk, and therefore your livelihood is at risk. So talk a little bit about how harmful deepfakes are in the long term and why it is important to get legislation like this to the President's desk, and then talk about fellow artists you have spoken with and their concerns on the issue.
Martina McBride:
Well, as you said, it does affect livelihood for musicians, backup singers, voiceover actors, authors, so many people in the arts. For me, being established and having done this for over 30 years, that's not necessarily my first concern. I have the luxury of that not being my first concern, but it is a concern for younger artists who are coming up. So as I said in my testimony, the thing that I'm most concerned with personally is how we work so hard to present ourselves with integrity and a certain character, and the fact that that could be distorted or manipulated to be the opposite of what I stand for and what I believe, or used to cause harm to someone through endorsing a harmful product, or, far into the future after I'm gone, somebody creating a piece of music or me saying something that I never did, and it just kind of disintegrating what I've worked so hard to establish, which is trust with my fans, with people who, when I say something, believe it.
I think for younger artists, to your point about livelihood, to be new and having to establish what you stand for and who you are as a person, as an artist, what you endorse, what you believe in, and building trust with your fans, and then on top of it having to navigate these waters of someone coming in and distorting all of that, is devastating. I can't stress enough how it can impact the careers of up-and-coming artists, even just in their ability to speak their truth, or to have to live in fear of being a victim of these deepfakes.
Sen. Marsha Blackburn (R-TN):
Mr. Glazier, I want to come to you on something you mentioned about the critical balance of protecting artists' voices and likenesses while also reducing litigation, and why we need to have this framework. I think it helps artists stay out of court; I mean, they're at a point where they may have to spend much of what they've earned in order to protect themselves and to protect their brand, if you will. If you'll elaborate on that.
Mitch Glazier:
Sure, I'm happy to. The bill has to be effective and practical at the same time, both for the victim and for the platform that is going to limit the damage to the victim. It has to work on both ends, and that's why I think the approach taken both in the Take It Down Act and in this act is so important, especially in areas where the platform has less knowledge and less control, because end users are posting on the platform and those posts can go viral very, very quickly. The ability for the platform to take it down as soon as technically and practically feasible, so that they stop the damage, and to keep it down, so that the artist or any other victim doesn't have to spend their life monitoring a platform and continually sending more notices and more notices as end users keep putting up the same material over and over and over again. We now have tools that will allow the removal of that material off of the platform.
And once the removal is done, the damage can be limited. There is no liability for the platform, and the artist doesn't have to spend their time just litigating. Where there is more knowledge and control, where the platform has an employee upload it, for example, then there should be responsibility on the platform. Those are cases where you might need to go to court, because the platform could have prevented it and didn't. So I think the bill is incredibly balanced and really innovative in its approach to protecting free speech and reducing litigation while also effectively protecting the right that's necessary.
Sen. Amy Klobuchar (D-MN):
All right. Thank you very much. I guess I'll start with Mr. Brookman, the non-Grammy winner. I want to talk to you just a little bit about this consumer angle here, which I think is interesting to people. I think at its core, all of us involved in this legislation have made it really clear that it's not just people who are well-known that will be hurt by this eventually, and that getting this bill passed as soon as possible is just as important for everyone. But I do so appreciate Ms. McBride being willing to come forward, because those stories, and the stories that we've heard from many celebrities, like Jamie Lee Curtis, whom I mentioned, are very important to getting this done. So you just did a report on AI-generated voice cloning scams, which found that AI voice cloning applications, in the words of the report, present "a clear opportunity for scammers," and that we need to make sure our consumer protection enforcers are prepared to respond to the growing threat of these scams.
I had this happen to my state director's husband. Their kid is in the Marines, and they got a call. They figured out that it wasn't really him asking for things and money; they knew he couldn't call from where he was deployed. But this is just going to be happening all over the place, and the next call will be to a grandma who thinks it's real, and she sends her life savings in. So I have called on the FTC and the FCC to step up their efforts to prevent these voice cloning scams. What are some of the tools that agencies need to crack down on these scams, even outside of this bill?
Justin Brookman:
Yeah, absolutely. So I think the first thing the Federal Trade Commission probably needs is more resources. They only have like 1,200 people right now for the entire economy. That's down, like, a hundred just in the past couple of months.
Sen. Amy Klobuchar (D-MN):
Down from... way down from even during the Nixon years.
Justin Brookman:
Yeah, it used to be 1,700, and the economy has grown three or four times. Chairman Ferguson has said more cuts are coming, which I think is the wrong direction. I worked at the Federal Trade Commission for a couple of years; we could not do a fraction of all the things that we wanted to do to protect consumers. So: more people, more capacity, more technologists. There's just not enough technology capacity in government. I was in the Office of Technology Research and Investigation there, and that was like five people. That's just not enough, obviously, with all these very sophisticated, I mean, just deepfakes alone, let alone the rest of the tech economy. Then there's the ability to get penalties and even injunctive relief. If someone gets caught stealing something, the FTC often doesn't have the ability to make them give the money back. I know this committee has tried to restore that authority, but that would be important. And also, again, maybe the FTC could have rulemaking authority. But I would also like to see Congress consider legislation to address tools: again, if you are offering a tool that can be used only for harm, voice impersonation, deepfake pornographic images, maybe there should be responsibilities to make sure it's not being used for harm.
Sen. Amy Klobuchar (D-MN):
Okay. Thank you. Ms. Carlos, could you talk about what YouTube is doing to ensure it's not facilitating these scams?
Suzana Carlos:
Sure. And thank you for the question, Senator.
Sen. Amy Klobuchar (D-MN):
And thanks for your support for the bill.
Suzana Carlos:
Of course. So just to start: we obviously see tremendous opportunity coming from AI, but we also acknowledge that there are risks, and it is our utmost responsibility to ensure that it is deployed responsibly. So we've taken a number of steps to protect against harmful content on our platform. Primarily, we updated our privacy policies last year to ensure that all individuals can now submit a notice to YouTube when their unauthorized voice or likeness has been used on our platform, and once reviewed, if it is applicable and we've confirmed that that content should be removed, we will take it down. We've additionally implemented watermarks on our AI products. We originally began with image and audio watermarks using our SynthID technology, and we've since expanded it to apply to text generated from our Gemini app and web experience and, most recently, to our Veo video tool. We've also taken the additional step of becoming a member of C2PA, the Coalition for Content Provenance and Authenticity, where we're serving as a steering member to work with the organization to create indicators and markings that will allow the provenance of content created off platform to additionally be recognized. And we're deploying those technologies across our platform.
Sen. Amy Klobuchar (D-MN):
Okay. Thank you. We've mentioned the Take It Down Act, and thank you for the support for that. Mr. Glazier, you talked about how this is the first federal law related to generative AI and that it's a good first step. Could you talk about what's going to happen if we don't move on from there and we just stop and don't do anything for years, which seems to be what's been going on, and why it's so important to do this?
Mitch Glazier:
I think there's a very small window and an unusual window for Congress to get ahead of what is happening before it becomes irreparable. The Take It Down Act was an incredible model. It was done for criminal activity.
Sen. Amy Klobuchar (D-MN):
I know.
Mitch Glazier:
Right? You know, you wrote it. It was a great model, but it only goes so far. We need to use that model now and expand it carefully, in a balanced way, to lots of other situations, which is exactly what the No Fakes Act does. And I think we have a very limited amount of time to allow people and platforms to act before this gets to a point where it's so far out of the barn that, instead of encouraging responsible AI development, we allow investment and capital to go into AI development that hurts. So let's encourage investment the right way, to boost great AI development and be first; let's not be the folks that encourage investment in AI technologies that really harm us.
Sen. Amy Klobuchar (D-MN):
And Ms. Price, you've expressed concerns about this 10-year moratorium on state rules. I'm very concerned, having spent years trying to pass some of these things. I think that one of the ways we pass things quickly, like Mr. Glazier was talking about, is if people actually see a reason: they don't want a patchwork, they want to get it done. But if you just put a moratorium in place, and you look at the ELVIS Act coming out of Tennessee, Ms. McBride, and some of the other things, that would stop all of that. Could you, my last question here before we go to another round, could you talk about why you're concerned about what is right in front of us now, which is this 10-year moratorium?
Christen Price:
Yes. Thank you for the question, Senator. We are concerned about the moratorium because it's basically signaling to the AI companies that they can do whatever they want in the meantime, and it inhibits states’ ability to adapt their laws to this form of technology that's changing very quickly, and then has this potential to cause great harm.
Sen. Amy Klobuchar (D-MN):
Thank you.
Sen. Marsha Blackburn (R-TN):
And I know Senator Coons is on his way and Senator Hawley is coming back, but Ms. Price, staying with you: you talked about the Take It Down Act and its importance, but touch on the gap that No Fakes fills for a child who may have something posted that doesn't fit under Take It Down, and how this would open up an avenue of recourse for them.
Christen Price:
Yes. Thank you, Senator. So under the No Fakes Act, because there is a private right of action, there would be another way, essentially, for a victim to seek accountability from a perpetrator or platform, which is really important because the layers of accountability are what really deter bad actors from engaging in harm. So having the criminal penalty, but then also having the ability to bring the private right of action, the civil action, is important.
Sen. Marsha Blackburn (R-TN):
And speaking to the states and their actions, I do want to mention that Tennessee passed the Elvis Act, which is like our first generation of the No Fakes Act. And we certainly know that in Tennessee we need those protections. And until we pass something that is federally preemptive, we can't call for a moratorium on those things.
Sen. Amy Klobuchar (D-MN):
Excellent statement.
Sen. Marsha Blackburn (R-TN):
Of course. Of course. Ms. Carlos, I want to talk with you for just a minute. We are grateful for the support that you all have talked about, and there's a provision in the bill that I know is important to your platform and many others, and that's the notification piece: giving individuals who are harmed, you've talked about artists being able to contact you, a way to notify you, letting you know about this, asking for that content to come down, and then you taking that action. We have worked on the Kids Online Safety Act, and one of the complaints that came to Senator Blumenthal and me from individuals who tried to get things taken off was that they could not get a response. So the notification is an imperative. Talk a little bit about how you're approaching notification.
Suzana Carlos:
Thank you. Thank you for the question. So in looking at the framework of No Fakes: again, we began with a voluntary framework on YouTube, which allows individuals to notify us when digital replica content of them is online, and this is smartly mirrored in the No Fakes Act. It empowers a user to identify content and flag it to us when they believe it should be removed as an unauthorized use of their voice or likeness. And as you mentioned, that notification is critical, because it signals to us the difference between content that is authorized and harmful fakes, and it's with that notice that we are able to review content and make an informed decision as to whether or not it should be removed.
Sen. Marsha Blackburn (R-TN):
And then what is your length of time for getting it down upon receiving notification? What is your process going to be on implementation?
Suzana Carlos:
Sure. So we envision a framework similar to the one under the DMCA, where a web form would be easily available for any user, quickly filled out, and then submitted to our trust and safety team. We make every effort to review every notice on a case-by-case basis and remove content as soon as possible.
Sen. Marsha Blackburn (R-TN):
So are you talking hours, days? What is your framework?
Suzana Carlos:
I don't have the exact number on the top of my head, but I do know that we try to process every notification as quickly as possible.
Sen. Marsha Blackburn (R-TN):
Thank you. If you will, check on that. Sure. And then get that information back to us. I think we would like to know that, because the fact that this has taken such lengths of time for people to get any kind of response has been very difficult for consumers, and they feel like they're talking to outer space and nobody is listening and nobody's responding.
Suzana Carlos:
Thank you for flagging the concern. I'd be happy to follow up with you on the committee.
Sen. Marsha Blackburn (R-TN):
I appreciate that. Senator Coons, you're recognized for five minutes.
Sen. Chris Coons (D-DE):
Thank you so much. Madam Chair, I'd like to first thank Ms. McBride for being here to testify in support of No Fakes. Could you speak to why this bill is so important both to protect artists like you and to protect your fans? Thank you.
Martina McBride:
Thank you. I think that it's important because, as artists, we hopefully want to speak the truth. We want to build a relationship with our fans in which they trust us, so that they believe what we say. So when you have something like a deepfake that either sells a product or makes a statement, it can be so harmful to that trust. I mean, I just realized, sitting here, that I bought a product, a collagen supplement, off of Instagram the other day because it featured LeAnn Rimes and a couple of other people, and I'm sitting here thinking, oh my goodness, I don't even know if that was really them. Right? So it's damaging to the artist and to the fan. We had a situation personally where one of my fans believed they were talking to me and ended up selling their house and funneling the money to someone who they thought was me. That is so devastating to me, to realize that somebody who trusts me could be duped like that. And I also think that eventually somebody who is duped by a deepfake is going to be angry enough to seek retribution, and we're on stages in front of thousands of people, we're in public places. So it's a danger to the artist as well.
Sen. Chris Coons (D-DE):
Mr. Glazier, to follow up on Ms. McBride's testimony, what do you think are the consequences for the music industry if we don't get No Fakes over the finish line? What will the consequences be for music fans and for the industry?
Mitch Glazier:
The entire music ecosystem is dependent on the authentic voice and the authentic image of the artist. That is what the music industry is. If you allow deepfakes to perpetuate, you're taking the soul out of the art, and when you do that, you're taking the humanity out of the art, and that's what art is. So I think it's fairly existential that the voice of music be the voice of music; that's what everything is built on. It's almost bizarre that we have to sit here today talking about allowing someone to protect the use of themselves. If there's anything that we have a right in and should be able to control, it's the gifts that God gave us, the voices that we have, the images that we have, and for that to be taken from you is devastating, both for the individual and obviously for the industry itself, which is built on these very voices.
Sen. Chris Coons (D-DE):
Ms. Carlos, if I might, I just want to thank you for YouTube's partnership in getting to the place where you support No Fakes; other tech companies haven't come forward. I'd be interested in what you might say, or encourage me to say, to the Metas or TikToks of the world about why they should support this bill, even though it imposes new obligations on them. And some have argued that No Fakes might chill legitimate speech by incentivizing platforms to over-remove content out of fear of being sued. How does YouTube think about balancing its obligations under this bill with its First Amendment obligations to users?
Suzana Carlos:
Thank you for the question, Senator. As we mentioned, YouTube largely supports this bill because we see the incredible opportunity of AI, but we also recognize those harms, and we believe that AI needs to be deployed responsibly. I believe Mr. Glazier mentioned during his opening statement that the No Fakes Act does carry First Amendment exemptions, for parody, satire, and newsworthiness, and that is one of the reasons we felt comfortable endorsing this bill. We are, at the end of the day, an open platform, and we believe that a variety of viewpoints can succeed on YouTube. So those would be some of the things that I would share with you to share with those other companies, but I cannot speak directly to why they may or may not choose to endorse the bill.
Sen. Chris Coons (D-DE):
Understood. Thank you all and thank you all for your testimony today. Thank you.
Sen. Marsha Blackburn (R-TN):
Senator Hawley, you're recognized for five minutes.
Sen. Josh Hawley (R-MO):
Thank you very much, Madam Chair, and thanks to all of the witnesses for being here. Ms. Carlos, if I could just start with you: you're here on behalf of YouTube, is that right?
Suzana Carlos:
That is correct.
Sen. Josh Hawley (R-MO):
Can you tell me why is it that YouTube has monetized videos that teach people how to generate pornographic DeepFakes of women? Why does that happen on your platform?
Suzana Carlos:
Thank you for the question. Protecting our users is one of our top priorities. My general expertise is in music policy, so I'm not in the best position to answer that question, but I'm happy to follow up with you.
Sen. Josh Hawley (R-MO):
Do you know how many such videos there are out there that are, these are monetized videos now on YouTube?
Suzana Carlos:
I'm not aware of that number. I can say that our community policies do not allow that type of content on our platform.
Sen. Josh Hawley (R-MO):
Well, Forbes Magazine just reported that YouTube has in fact promoted over a hundred YouTube videos, with millions of views, that showcase AI deepfake porn and include tutorials on how to make deepfake porn, particularly porn that targets young women. Do you have any idea how much money YouTube has made off of this monetization?
Suzana Carlos:
Thank you for bringing this to my attention. I do not have details on this specific news article. I'm happy to follow up with you in the committee.
Sen. Josh Hawley (R-MO):
So you don't have any idea of how many ad dollars YouTube has made off of this?
Suzana Carlos:
I do not.
Sen. Josh Hawley (R-MO):
Are you aware that one of the websites promoted by YouTube in these videos was later cited in a criminal prosecution for generating AI sexual abuse material, let me be more specific, AI sexual abuse material involving children?
Suzana Carlos:
Thank you again for the question, Senator. As we mentioned earlier, YouTube has endorsed the Take It Down Act, and we take these issues very seriously. Again, I'll note that I represent music policy and do not have the information to give you a fulsome response during today's hearing.
Sen. Josh Hawley (R-MO):
Let me ask you this then. If a teenage girl's face ends up in an AI porn video on your platform, what does YouTube do about it? What's her recourse right now? What can she do to get some recompense, get some restitution?
Suzana Carlos:
Over a year ago, we updated our privacy policy so that anybody who believes that their voice or likeness is being used without their authorization on our platform can submit a request for removal.
Sen. Josh Hawley (R-MO):
A request for removal. Is there some policy on reimbursement for any profits the company may have made? Again, if these videos are monetized, I mean, does the victim get a share of anything?
Suzana Carlos:
I'm not aware of those policies. I would have to follow up with you, Senator.
Sen. Josh Hawley (R-MO):
Why is it that the enforcement of YouTube's own policy here seems to only happen after videos go viral? Is there a reason for that?
Suzana Carlos:
I do not have the answer to that question for you.
Sen. Josh Hawley (R-MO):
Do you know how many AI generated deepfake videos or deepfake content is removed before a victim complains? Does the victim have to complain before YouTube does anything?
Suzana Carlos:
Again, my specialty is in music policy. I do understand that we use technology such as AI to search for that content, and when it is in violation of our policies, we will remove it.
Sen. Josh Hawley (R-MO):
Let me ask you about this YouTube training data. Has YouTube provided data for use in Google's Gemini or other AI training programs?
Suzana Carlos:
YouTube does provide data for Google's training in accordance with our agreements.
Sen. Josh Hawley (R-MO):
So if an artist uploads music to YouTube, does the company use that music to train AI models?
Suzana Carlos:
As I mentioned, we do share data in accordance with our agreements. I can't speak to the specifics of any individual agreement.
Sen. Josh Hawley (R-MO):
Well, so how are people like Ms. McBride protected? I mean, so if you're an artist and you put any content on YouTube, does that mean that it's just free range? I mean, you can do whatever you want with it.
Suzana Carlos:
Again, it goes down to the terms of our agreement. I will say that we have forged deep partnerships with the music industry. We came out of the gate with forming AI music principles with the music industry and are continuing to experiment with them to see how AI can best benefit their creative process.
Sen. Josh Hawley (R-MO):
So are there privacy protections? You're telling me YouTube has in place privacy protections for artists?
Suzana Carlos:
They apply to all individuals on our platform.
Sen. Josh Hawley (R-MO):
So this is the click-wrap scenario. In order to watch cute dog videos or whatever, you've got to click "I consent," and that wraps it in. You basically give consent for your stuff to be used.
Suzana Carlos:
There are all different types of various agreements, but our terms of service are included in that batch.
Sen. Josh Hawley (R-MO):
So I guess my question is where are users told about their privacy protections, if they have any, and where did they explicitly consent?
Suzana Carlos:
They agree to our terms of service, and we also have our privacy policy available on the web.
Sen. Josh Hawley (R-MO):
Okay, so that's the click-wrap. So in other words, if you come onto YouTube and you want to use it, you've got to click through. So you click it, and there you've basically agreed to allow YouTube to give your content to AI and allow them to train on it without any further consent. Is that basically it?
Suzana Carlos:
Again, we implement our policies, and the terms of our agreement are what goes into our training.
Sen. Josh Hawley (R-MO):
Well, and I'm asking you the content of that agreement. So in other words, if I'm an artist and I upload something to YouTube and yeah, sure, I've clicked the button that says, yeah, I want to be able to use YouTube. Are you telling me that I don't have any further recourse? If YouTube then goes and gives the information to AI models and systems, there's nothing further I can do. Or am I missing something?
Suzana Carlos:
If it is in accordance with our agreements, we will share that data.
Sen. Josh Hawley (R-MO):
Yeah, that seems like a big problem to me. That seems like a huge, huge problem to me, and the fact that YouTube is monetizing these kinds of videos seems like a huge, huge problem to me. I am glad you're here today. I wish there were more tech companies here today, but we've got to do more. I mean, YouTube, I'm sure is making billions of dollars off of this. The people who are losing are the artists and the creators and the teenagers whose lives are upended. We've got to give individuals powerful, enforceable rights in their images, in their property, in their lives, back again, or this is just never going to stop. Thank you, Madam Chair.
Sen. Marsha Blackburn (R-TN):
Thank you. And that is the reason we have the No Fakes bill, and we are trying to push it across the finish line. I would like to offer a second round. Senator Klobuchar, do you have additional questions?
Sen. Amy Klobuchar (D-MN):
Very short. I know that Senator Coons has asked some of my questions about people's personal experience with this. I guess I'd ask you, Mr. Glazier, I'm not sure you were asked about this. Do you agree that using copyrighted materials to create copycat content undermines the value of the music created by artists and could slow the creation of new art?
Mitch Glazier:
Absolutely. If you are able to copy copyrighted material for any purpose without consent, you're basically allowing the person who's copying to make the money and to do with it what they want, but not the creator who's supposed to actually control it and who made it to be compensated for it and to control its exploitation. It's the very opposite of what the Constitution calls for in creating intellectual property.
Sen. Amy Klobuchar (D-MN):
Very good. One last question. Sure, last one, on the consumer education issue that was raised. Thank you. I'm sure you all care about it, but Mr. Brookman, we should not place the burden solely on consumers to protect themselves from AI scams; I don't think that's going to work very well. What steps should Congress take to help educate consumers when it comes to AI literacy and the like? I think it's something we could have some agreement on.
Justin Brookman:
Yeah, I mean, I think spending the money for a public awareness campaign is, I think, a really good idea. I think people hear stories of friends of friends who it's happened to, but a lot of people just have no idea that the things they see online, the things they see on Facebook, are just not real.
Sen. Amy Klobuchar (D-MN):
One that says, I'm the fourth richest woman in the world now?
Justin Brookman:
Congratulations.
Sen. Amy Klobuchar (D-MN):
Yeah, that just this week, I'm sorry. I don't want to exaggerate, America. And then people try to defend me by sending out the list of the top 10 richest with like Oprah, and I always think it's kind of sad that I'm nowhere near it. But yes, that's the latest thing that's there. Go on.
Justin Brookman:
Yeah, training people to be aware of it, to think about it, just to watch out for social engineering attacks, false calls for urgency. A deepfake voice right now is usually good for a little while, but it's getting better, and it is going to continue to get better. So one idea is having a family safe word, a word that only you and your family know that the scammer can't get. But they have access to a lot of personal data about us, so we are all vulnerable. The numbers are going up dramatically. So just like teaching people, like I said, it's a shame we have to teach people to do this, but it is the world we live in.
Sen. Amy Klobuchar (D-MN):
Thank you. Thank you, Senator Blackburn.
Sen. Chris Coons (D-DE):
Ms. Price. I was glad to see President Trump sign the Take It Down Act earlier this week. Why is No Fakes still necessary if Take It Down is on the books?
Christen Price:
Thank you, Senator. No Fakes is still necessary because it provides a way for victims to bring a civil lawsuit on their own behalf. And so there's an importance to having, yes, on the one hand, the criminal piece, the criminal law accountability and the required takedown under the FTC, but then, of course, the victims being able to bring their own lawsuit if they wish to do that. It's more effective for deterrence to have multiple things.
Sen. Chris Coons (D-DE):
Ms. Carlos, why did YouTube come to the table? You could have just made it a whole lot harder for the bill to move forward if you didn't make concessions and agree to be a part of advocating for the bill.
Suzana Carlos:
Thank you for the question, and thank you for including us in that round of stakeholders. So YouTube sits in a very unique kind of universe. We not only have our users and music partners and media partners, but we also have creators. And that is one area where this idea of digital replicas can cause real-world harm. So in addition to supporting No Fakes, which gives them the individual right to remove content, not just from YouTube but from other platforms, we're continuing to invest in new technology, which we refer to as Likeness ID, which will allow participating members in our pilot to have their face and voice scanned, and we'll be able to match across our platform. So we're continuing to invest in this technology, as we see it's a top issue.
Sen. Chris Coons (D-DE):
Thank you. Thank you very much. Senator Blackburn, Ranking Member Klobuchar.
Sen. Marsha Blackburn (R-TN):
Thank you, Senator Coons. Mr. Glazier, I want you to touch on contracts. We've had quite a discussion this week on copyright, and as artists negotiate these contracts for their name, their image, their likeness. Recently, SAG-AFTRA made a move in some of their negotiations on this. But talk a little bit about the importance of having a federal standard as it relates to standard contract law.
Mitch Glazier:
Yeah, this goes to the very essence of consent for the artist. And so not only does the No Fakes Act give control and consent to the individual about the use of their voice and the use of their likeness, it also imposes some guardrails around the length of those contracts, what those contracts mean while the person is alive, and what they mean after the person passes. And it also has special provisions that protect minors who might enter into contracts. That includes parents and guardians and also court authority. So it does a very good job of preventing abuse while giving the power to the individual whose voice is at stake and whose image is at stake in being able to license it.
Sen. Marsha Blackburn (R-TN):
Thank you so much. Ms. Price, I want you to submit to us, and you can do this in writing: when we look at the physical world and the statutes that exist for protecting individuals from some of the harms that you've listed today, and the importance of Take It Down and the importance of No Fakes, we don't have all of those criminal statutes that transfer to the virtual space. And I would like for you to give me a summary of your thoughts on that. Your testimony is expansive and helpful, and as I said, we've submitted that whole testimony. Mr. Brookman, we've submitted your entire testimony also, and we thank you for that, but I would like to have just a little bit more from you on that issue of those protections. As we've talked about No Fakes and KOSA, we talk often about this difference, and you touched on it, and I'd like to have something more expansive. With that, we have no further members present and no further questions. I will remind you all that members have five days to submit questions for the record, and then you're going to have 10 days to return those answers to us. I thank you all. Our witnesses have been wonderful today. We appreciate your testimony for the record. And with that, the hearing is adjourned.