Home

Transcript: US Senate Judiciary Subcommittee Hearing on "The NO FAKES ACT"

Prithvi Iyer / May 1, 2024

Tahliah Debrett Barnett (aka FKA Twigs) testifies at a Senate Judiciary Subcommittee hearing on April 30th, 2024, at the Dirksen Senate Office Building in Washington, DC.

On Tuesday, April 30th, 2024, the United States Senate Judiciary Committee’s Subcommittee on Intellectual Property held a hearing titled “The NO FAKES Act: Protecting Americans from Unauthorized Digital Replicas.” Witnesses included:

  • Lisa P. Ramsey: Professor of Law, University of San Diego School of Law (written testimony)
  • Graham Davies: President and Chief Executive Officer of the Digital Media Association (written testimony)
  • Ben Sheffner: Senior Vice President and Associate General Counsel for Law and Policy at the Motion Picture Association (MPA) (written testimony)
  • Duncan Crabtree-Ireland: National Executive Director and Chief Negotiator for the Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) (written testimony)
  • Robert Kyncl: Chief Executive Officer, Warner Music Group (written testimony)
  • Tahliah Debrett Barnett (aka FKA Twigs): Singer, songwriter, dancer, and actor based in London, UK (written testimony)

What follows is a lightly edited transcript.

Sen. Chris Coons (D-DE):

The Senate Judiciary Committee will come to order. I'd like to thank all of our witnesses for participating today and my colleagues for joining me. I'd like to specifically thank ranking member Tillis and his staff for working on a consensus basis to put this hearing together and to thank Senator Blackburn and her staff as well for partnering with us on this hearing. About ten months ago, Senator Tillis and I held a subcommittee hearing on artificial intelligence and copyright law and their intersection, and I opened that hearing with the debut of a new AI-generated song, AI AI, a riff on Frank Sinatra's New York, New York with lyrics created by ChatGPT and voice cloning technology used to mimic Frank Sinatra's voice. The song was fun to create with permission from the rights holders, of course, but it also highlighted some pressing legal questions that generative AI tools raised. Was my song protected speech?

If I hadn't gotten permission, would the song have violated Mr. Sinatra's rights to his voice or his style? Since that hearing, AI-generated replicas have only grown more pervasive, from deepfake videos of celebrities hawking products to songs made with voice cloning tools posing as legitimate hits to scam calls mimicking a panicked grandchild's voice. AI-generated replicas of Tom Hanks and Gayle King were used to advertise medical services, and a fake version of Elon Musk encouraged consumers to invest in a cryptocurrency scam. Drake, Eminem, Ariana Grande, Taylor Swift, and Beyonce are just a few of the musical artists who've seen their voices mimicked by AI clones, and McAfee, a global leader in online protection, found that one in four American adults have experienced an AI voice scam, with three-quarters of them having lost money. Scammers using AI-generated replicas of a grandchild's voice to trick a grandparent out of money have become remarkably sophisticated.

Both the FCC and FTC have issued warnings. Now, these examples all relate to commercial speech, but AI deepfakes don't stop there. We've seen other examples: non-consensual explicit deepfake images and videos that are addressed in the DEFIANCE Act that Senators Durbin and Graham have introduced, and election interference addressed in the Protect Elections from Deceptive AI Act that Senators Klobuchar, Hawley, Collins, and I have introduced. Bluntly, these issues aren't theoretical. Deepfake pornographic images of Taylor Swift circulated broadly on X, formerly known as Twitter, before they were taken down. A voice clone of President Biden encouraged voters to stay home during the New Hampshire primary, and in Slovakia, a deepfake likely had an impact on the outcome of a national election. In summary, as AI tools have become increasingly sophisticated, it becomes easier to replicate and distribute fake images of someone, fakes of their voice, fakes of their likeness, without consent.

We can't let this challenge go unanswered, and inaction should not be an option. As President Biden cautioned during his State of the Union address, we must regulate AI and voice impersonation but do so thoughtfully, striking the right balance between defending individual rights and fostering AI innovation and creativity. Both Congress and the administration have been working to strike that balance; a bipartisan group of senators, Young, Heinrich, Rounds, and Schumer, convened nine AI forums last year, and Senator Schumer has encouraged committees to work on AI legislation on a bipartisan basis, just as we're doing today.

That's why I was excited to release the NO FAKES Act discussion draft last October with Senators Tillis, Blackburn, and Klobuchar. This bill would protect people from having their images, voices, or likenesses used to create digital replicas that say or do things they never agreed to or would never say. The bill accomplishes this broad goal in two ways: by holding individuals and companies liable if they produce an unauthorized digital replica of an individual's voice, image, or likeness, and by holding platforms liable if they host or distribute an unauthorized digital replica if the platform knows the person depicted did not authorize it.

Unlike current right of publicity laws that many states have enacted, which often are focused on celebrities who monetize their likeness and leave ordinary people without a remedy, NO FAKES Act protections would apply to all individuals regardless of whether they commercialize their voices, images, or likenesses. Our bill tries to be careful to balance these protections against free speech rights. The First Amendment will, of course, apply to this bill whether we say it does or not, but we made clear that long-recognized carve-outs, like, for example, parody and satire, remain available to creators to continue to foster the artistic and innovative potential of AI. Over the past six months, we have had literally dozens of meetings and received extensive feedback, hundreds if not thousands of proposed revisions, tweaks, edits, and wholesale changes on the discussion draft from stakeholders who loved the draft, who hated the draft, and everyone in between. That was exactly the point, and I appreciate the many constructive suggestions we've received. That's also the point of having a hearing today with folks who support the bill, who question the bill, who oppose the bill, and to have a real dialogue. Let me close. The feedback has centered around five core areas: whether we should include a notice and takedown structure similar to the DMCA; whether we've struck the right balance with First Amendment exclusions; whether a 70-year postmortem term should be adjusted or narrowed; whether our bill should have preemptive impact over similar state laws; and whether the bill should create some process by which individuals with limited resources and minimal damages can enforce their rights under the law. So, I look forward to continuing this work with my colleagues and, immediately following this hearing, to working to promptly formalize the NO FAKES Act for introduction next month with Senator Blackburn and Senator Tillis, whose cooperation has been great.
We've assembled a wonderful panel today with diverse perspectives. I encourage you, as our witnesses, to tell us what you like about the draft, what you dislike about the draft, and be specific about what changes you would like us to consider and why. I'll introduce the witness panel in a moment, but let me next invite Senator Tillis to make his opening remarks. Thank you.

Sen. Thom Tillis (R-NC):

Thank you, Chairman Coons. As you were going through the description of the NO FAKES Act, I was thinking about how much we've done. I love this subcommittee because we actually do work here, and we actually have a bunch of IP nerds or other interested parties showing up. We have a lot of people interested in this space, but I really think the NO FAKES Act is unique among the other bills that we've carried forward in terms of intellectual property because it touches everybody. Normally, it's about patent holders or creators, and this touches everybody, every socioeconomic stratum. It's interesting, but it's also one of the reasons why we've got to get it right. We've got to make sure that we come up with concrete solutions. We don't want to overreach. There is a need for legislation, so for anyone who's in the "it ain't broke, don't fix it" category, I respectfully disagree, but I'd be fascinated to hear your testimony if we have witnesses who are in that position. But we also don't want to miss the opportunity, and we don't want to stifle opportunities for innovation. That's why it's so important to get it right.

We don't even know what AI is going to look like ten years from now. Interestingly enough, AI is going to make AI even more sophisticated over a much shorter period of time, so that's got to be instructive to our policy formulation. But we've all seen, as Chairman Coons has indicated, replicas, deepfakes, photos, videos, audio. We're going to show you an example here shortly, and the number is just growing, so we have to work on it, and we have to do it fairly. Entertainers, politicians, and the public at large have been subjected to fake media for much of the last hundred years, but now it's getting serious, and it's multiplying at a rate that requires Congressional action.

I think I want to go forward a little bit in my comments because I think Senator Coons did a good job of describing some of the challenges and some of the things we want to work on with our bill, but, if Steph is prepared, I'd like to show you a video to give you a recent example. This was interesting. I use AI every morning as a part of my news feed, so I interact with generative AI for about an hour every morning, and have for about two years, since OpenAI first released the beta version of ChatGPT, and it was a week or so ago that I saw the estate of Tupac questioning a recent production. I said, more to come; let's study this more. And we thought it was probably interesting, for maybe folks who aren't following the issue as closely as us, to show the video. We've got staff ready to cue that up.

Tupac or AI? (Voice illustration):

[An AI-generated verse mimicking the voice of Tupac Shakur plays; the audio is only partially intelligible.] Kendrick, we need you, the West Coast savior ... engraving your name in some hip-hop history ... to deal with this viciously, you seem a little nervous about all the publicity ... Canadian light-skinned, Dot, we need a no-debatable West Coast victory, man ... talk about him liking girls ... heard it on the podcast, it's got to be true ... they told me to spare the ...

Sen. Thom Tillis (R-NC):

So that entire musical rendition is a product of AI, and interestingly, one of those images, that name, image, and likeness, is the property of Tupac's estate. The other one is an AI-generated image that was obviously created, to the extent it was used for commercial purposes, in violation of a copyright. So it just gives you an example, a real-world one; this is not a hypothetical. This happened beginning a week or so ago, shortly after Drake released that song. So we've got work to do, and legislation addressing the misuse of digital replicas will have a multi-billion dollar implication. We've got to get it under control. Now, there are a lot of questions that have to be asked, and my office, in particular, is guilty of putting drafts out there knowing that they're drafts. Sometimes we even do it sooner, without the cooperation or involvement of other members, because we put scary stuff out there to give you a ghost of Christmas future.

In this case, we didn't do that. We tried to work on putting out a discussion draft that makes sense, but we have a lot of things that we have to work out, questions that we need answered. Why mandate that individuals have no right to license out their digital likeness unless they retain counsel? Should we create an exception for harmless, non-commercial uses? Should there be a notice and take-down provision? I mean, there's a lot. There's a litany. I'm not going to go through all of them. I hope that you all can come up with other ones, but I'll submit the rest of my written statement for the record. We have to act. Hopefully, in this Congress, we can act, which means we have to move very, very quickly or, at a minimum, lay down a baseline that we can pick up when we come back with a new Congress and get it right.

So I look forward in advance to everybody's active collaboration. I will always give you the same warning I give everyone. The only thing that really makes me mad is when I see somebody trying, through guerrilla warfare, to undermine the good faith efforts of this committee and my colleague. If you're at the table, you can have an influence. If you're not at the table, you're going to be on the table. So why doesn't everybody just recognize that our office is open to constructive criticism, to use cases where the policy doesn't make sense. But if you're in the category of "it ain't broke, don't fix it," you're not up with modern times. I look forward to a good, productive hearing today, and thank you in advance for your productive collaboration as this legislation moves forward.

Sen. Chris Coons (D-DE):

Thank you, Senator Tillis, and thank you for another positive and engaging hearing. It's been a great experience serving on this committee with you. Today, we welcome six witnesses to testify about the NO FAKES Act. Our first witness is Robert Kyncl, CEO of Warner Music Group, who has a lot of experience; he spent over a decade as YouTube's Chief Business Officer, among a number of other business engagements. Next, we have Twigs; thank you, Twigs, for joining us today, a singer, songwriter, producer, dancer, and actor who has used AI to help her create and also has had personal experience with unauthorized AI deepfakes. It's great to have a voice present from the creative community. Next, we have Duncan Crabtree-Ireland, National Executive Director and Chief Negotiator for SAG-AFTRA, the Screen Actors Guild-American Federation of Television and Radio Artists, a labor union representing 160,000 members who work in film, television, music, and more, also a voice of the creative community.

Then we have Ben Sheffner, Senior Vice President and Associate General Counsel for Law and Policy at the Motion Picture Association, where he specializes in copyright and First Amendment law. Thank you, Ben. We welcome Graham Davies, President and CEO of the Digital Media Association, an organization that principally represents audio streaming companies and platforms like Spotify and YouTube. Mr. Davies also has a history as a musician and songwriter advocate. Finally, we'll hear from Lisa P. Ramsey, professor of law at the University of San Diego School of Law, where she teaches and writes on the intersection of free speech rights and intellectual property law. After I swear in the witnesses, each will have five minutes to provide a summary of your opening statement; the senators have your written statements. Then, we'll proceed to questioning. Each senator gets five minutes for the first round. We will likely have two or even three rounds of questioning, time and attendance permitting. Witnesses, would you please stand and raise your right hand to be sworn in? Do you swear or affirm that the testimony you're about to give before this committee is the truth, the whole truth, and nothing but the truth, so help you God?

Thank you all. Let the record reflect the witnesses have been sworn in. Mr. Kyncl, you may proceed with your opening statement.

Robert Kyncl:

Chairman Coons, Ranking Member Tillis, and members of the subcommittee, I'm Robert Kyncl, Chief Executive Officer of the Warner Music Group. Being here today is something I could not have imagined as a young boy growing up behind the Iron Curtain in communist Czechoslovakia. In 1992, I crossed the Atlantic and attended a state university in New York, and there I met an amazing woman from the Dominican Republic who eventually became my wife, and now we have two amazing American daughters. I'm a proud US citizen, and I have a deep appreciation for the freedoms at the heart of this great country, having grown up without them. For the past 25 years, I've been a tech and media executive. I joined Warner Music last year after 12 years at YouTube and eight years at Netflix. Warner Music is home to an incredible array of artists and songwriters who are moving culture across the globe. One of those artists, FKA Twigs, is here with me today, and she's an extraordinarily gifted singer, songwriter, actor, and performer. I would also like to acknowledge and thank Duncan Crabtree-Ireland, who led the recent collective bargaining agreement negotiations between SAG-AFTRA and record labels that address concerns regarding AI and defend artists' rights.

Music has so often been the canary in the coal mine for the broader trends in our society. More than any other form of communication or entertainment, music drives culture and innovation, and that's happening again with generative AI. Today, music companies are helping artists, rights holders, and tech companies figure out this new world, which is both exciting and daunting. It's our job not only to help amplify artist creativity but to protect their rights, their livelihoods, and their identities. Across the industry, legends from Roberta Flack to the Beatles have embraced AI as a tool to enhance their creativity. At the same time, generative AI is appropriating artists' identities and producing deepfakes that depict people doing, saying, or singing things that have never happened before. My accent is a vestige of my Eastern European upbringing. You can hear my voice, my identity in my voice.

Through AI, it is very easy for someone to impersonate me and cause all manner of havoc. They could speak to an artist in a way that could destroy our relationship. They could say untrue things about our publicly traded company to the media that would damage our business. Unfettered deepfake technology has the potential to impact everyone. Even all of you: your identities could be appropriated and used to mislead your constituents. The truth is everyone is vulnerable. Families defrauded by voice clones pretending to be relatives, people placed in pornography without their consent, and school children having their faces inserted into humiliating scenes. Some people have spoken of responsible AI as a threat to freedom of speech, but it's precisely the opposite. AI can put words in your mouth, and AI can make you say things you didn't say or don't believe. That's not freedom of speech. We appreciate the efforts of this committee to address this problem, including the NO FAKES Act discussion draft authored by Chairman Coons, Ranking Member Tillis, Senator Blackburn, and Senator Klobuchar. Your leadership kickstarted efforts in this area, and we strongly support the bipartisan No AI FRAUD Act introduced in the House earlier this year by Representative Salazar and Representative Dean, and the recently enacted ELVIS Act in Tennessee.

As the committee moves toward the introduction of a Senate bill, there are three elements the bill should contain to be effective. One, an enforceable intellectual property right for likeness and voice: each person should be allowed to license or deny that right on free-market terms and seek redress for unauthorized uses. Two, respect for important First Amendment principles, without going any further and providing loopholes that create more victims. And three, effective deterrents: to incentivize a vibrant and responsible commercial marketplace, we need to maintain consequences for AI model builders and digital platforms that knowingly violate a person's property rights. I applaud the committee for its leadership in addressing these challenging and rapidly developing issues with urgency. Congress should pass legislation this year, before the genie is out of the bottle, while we still have a chance to get this right. I look forward to answering your questions. Thank you.

Sen. Chris Coons (D-DE):

Thank you, Mr. Kyncl. Twigs.

Tahliah Debrett Barnett (“FKA twigs”):

As artists, we dedicate a lifetime of hard work and sacrifice in the pursuit of excellence, not only in the expectation of achieving commercial success and critical acclaim but also in the hope of creating a body of work and a reputation that is our legacy. So why am I here today? I'm here because my music, my dancing, my acting, the way my body moves in front of the camera, and the way that my voice resonates through a microphone are not by chance. They are essential reflections of who I am. My art is a canvas on which I paint my identity and the sustaining foundation of my livelihood. It is the very essence of my being. Yet this is under threat. AI cannot replicate the depth of my life journey, yet those who control it hold the power to mimic the likeness of my art, replicate it, and falsely claim my identity and intellectual property. This prospect threatens to rewrite and unravel the fabric of my very existence. We must enact regulations now to safeguard our authenticity and protect against misappropriation of our inalienable rights. Three decades ago, we did not realize that the internet would embed itself so deeply into the core of our everyday lives. Policies and controls to keep pace with the emergence of technology were not put in place to protect artists, young people, and those who are vulnerable, and it ran away with us.

AI is the biggest technological advancement since the internet. You know the saying: Fool me once, shame on you; fool me twice, shame on me. If we make the same mistake with the emergence of AI, it will be a shame on us.

Let me be clear. I'm not against AI. As a future-facing artist, new technologies are an exciting tool that can be used to express deeper emotions, create fantasy worlds, and touch the hearts of many people. In the past year, I have developed my own deepfake version of myself that is not only trained on my personality but can also use my exact tone of voice to speak many languages. These and similar emerging technologies are highly valuable tools. This, however, is all under my control, and I can grant or refuse consent in a way that is meaningful. What is not acceptable is when my art and my identity can simply be taken by a third party and exploited falsely for their own gain without my consent due to the absence of appropriate legislative control and restriction. History has shown us time and again that in moments of great technological advancement, those in the arts have been the first to have their work exploited and, in many instances, fraudulently commoditized. Soon after, it follows that the general and more vulnerable public suffer the same types of image and voice-related exploitation. By protecting artists with legislation at such a momentous time in history, we are protecting a five-year-old child in the future from having their voice, likeness, and identity taken and used as a commodity without prior consent.

I stand before you today because you have it in your power to protect artists and their work from the dangers of exploitation and the theft inherent in this technology if it remains unchecked. I'm here on behalf of all creators whose careers depend deeply on the ability to create safely in the knowledge that they can maintain tight control over their own art, image, voice, and identity. Our careers and our livelihoods are in jeopardy, and so potentially are the wider image-related rights of others in society. You have the power to change this and safeguard our future as artists and, more importantly, human beings. We are a facet of our given, learned, and developed identity. Our creativity is the product of this lived experience overlaid with years of dedication to qualification, training, hard work, and, dare I say it, significant financial investment and sacrifice. That the very essence of our being at its most human level can be violated by the unscrupulous use of AI to create a digital facsimile that purports to be us and our work is inherently wrong. It is, therefore, vital that, as an industry and as legislators, we work together to ensure we do all we can to protect and create an intellectual rights system as well as protect the very basis of who we are. We must get this right. You must get this right before it's too late. Thank you.

Sen. Chris Coons (D-DE):

Thank you, Twigs. Mr. Crabtree-Ireland.

Duncan Crabtree-Ireland:

Thank you very much. Chairman Coons, Ranking Member Tillis, and members of the Subcommittee on Intellectual Property, my name is Duncan Crabtree-Ireland. I'm the National Executive Director of SAG-AFTRA, the country's largest labor union for entertainment and media artists, and I'm here today to testify in support of the NO FAKES Act. Our members believe that AI technology, left unregulated, poses an existential threat to their ability to, one, require consent for the creative use of their digital representation; two, receive fair payment for the use of their voice and likeness; and three, protect against having to compete against themselves, their own digital selves, in the marketplace. I'm the chief negotiator for the union's contracts, including last year's historic agreement with the major entertainment studios, which was only finalized after the longest entertainment industry strike in over 40 years, a strike that lasted nearly four months.

The strikes and the public's response to them highlighted that the entertainment industry and the broader public understand that AI poses real threats to them, and they fully support protections against those threats. For an artist, their image and likeness are the foundations of their performance, brand, and identity, developed over time through investment and hard work. SAG-AFTRA has long fought for right of publicity laws and voice and image protections. The exponential proliferation of artificial intelligence technologies, technologies that allow for rapid and realistic fakes of voices and likenesses in audio-visual works and sound recordings, makes this fight urgent for our members. Enshrining this protection as a federal intellectual property right will ensure our members, creative artists, and, frankly, all of us are protected, and that service providers provide the same protections to individuals' images, likenesses, and voices that they provide now for other intellectual property rights. These rights should be transferable and descendible just like any other intellectual property or any kind of property someone owns, with durational limitations on transfers during one's lifetime to ensure that we don't enter into an era of digital indentured servitude.

Actress and SAG-AFTRA member Olivia de Havilland similarly fought to establish the seven-year rule to end long-term abusive contracts in the old studio system. Some will argue that there should be broad, categorical First Amendment-based exemptions to any legislation protecting these important rights. There are no stronger advocates for the First Amendment than our members. They rely on their First Amendment rights to tell the stories that artists in other countries are often too endangered to tell. However, the Supreme Court made clear over half a century ago that the First Amendment does not require that the speech of the press, or any other media for that matter, be privileged over the protection of the individual being depicted. On the contrary, the courts apply balancing tests that determine which right will prevail. These balancing tests are critical, and they're incorporated into the discussion draft of the NO FAKES Act.

They ensure that the depicted individuals are protected and rewarded for the time and effort put into cultivating their persona while not unduly burdening the right of the press to report on matters of public interest or of the entertainment media to tell stories. At the same time, these tests help ensure the depicted individual is not compelled to speak for the benefit of third parties who would misappropriate the value associated with the persona they have carefully crafted. With new AI technologies that can now realistically depict an individual's voice or likeness with just a few seconds of audio or even a single photograph, and with the constantly evolving capabilities of these technologies, it is even more important that broad categorical exemptions be avoided and that the courts be empowered to balance the competing interests. It's also essential that action be taken to address these harms now. Our members, the public, and our society are being impacted right now by the abuse of deepfake technology, and we must take timely action. As just one of many examples: during the ratification campaign for our contract after the strike last year, an unknown party on the internet created an unauthorized deepfake video of me saying false things about our contract and urging members to vote against it. That was anathema to me as someone who had devoted more than a year of my life to a contract I deeply believe in. There was no federal right to protect me, no takedown right, and tens of thousands of people were misled about something that really mattered to so many of us.

It's neither necessary nor appropriate to wait for broader artificial intelligence regulation to be adopted. This narrow and technology-neutral approach can and should proceed expeditiously. The companies behind many of these technologies are asking for rules so they better understand the appropriate boundaries for their conduct. The NO FAKES Act will provide them with important guidance while helping to ensure individuals are protected from exploitation that puts their livelihoods and reputations at risk. Thank you again for this opportunity to speak today, and I look forward to answering your questions.


Sen. Chris Coons (D-DE):

Thank you, Mr. Crabtree-Ireland. Mr. Sheffner.


Ben Sheffner:

Chair Coons, Ranking Member Tillis, and members of the subcommittee, thank you for the opportunity to testify today on behalf of the Motion Picture Association and our member studios about legislation to regulate the use of digital replicas. For over a century, the MPA's members have employed innovative new technologies to tell compelling stories to audiences worldwide. From the introduction of recorded sound in the 1920s and color in the 1930s to the dazzling special effects in movies like this year's Dune: Part Two, the MPA's members have long used technology to allow filmmakers to bring their vision to the screen in the most compelling way possible. Artificial intelligence is the latest innovation impacting our industry. MPA sees great promise in AI as a way to enhance the filmmaking process and provide an even more compelling experience for audiences, but we also share the concerns of actors and recording artists about how AI can facilitate the unauthorized replication of their likenesses or voices to supplant performances by them, which could potentially undermine their ability to earn a living practicing their craft.

The NO FAKES Act is a thoughtful contribution to the debate about how to establish guardrails against abuses of such technology. However, legislating in this area necessarily involves doing something that the First Amendment sharply limits: regulating the content of speech. It will take very careful drafting to accomplish the bill's goals without inadvertently chilling or even prohibiting legitimate, constitutionally protected uses of technology to enhance storytelling. I want to emphasize: this is a technology that has entirely legitimate uses, uses that are fully protected by the First Amendment and do not require the consent of those being depicted. Take the classic 1994 film Forrest Gump, which depicts the fictional Forrest character, played by Tom Hanks, navigating American life from the 1950s through the eighties, including by interacting with real people from that era. Famously, the filmmakers, using digital replica technology available at the time, had Forrest interact and even converse with Presidents, or should I say former Senators, Kennedy, Johnson, and Nixon.

To be clear, those depictions did not require the consent of their heirs, and requiring such consent would effectively grant heirs or their corporate successors the ability to censor portrayals they don't like, which would violate the First Amendment. In my written testimony, I detailed some specific suggestions we have for improving the NO FAKES draft so that it addresses real harm without encroaching on First Amendment rights. Here, I'll highlight just four points. First, getting the statutory exemptions right is crucial, and I want to thank the drafters for getting much of the way there. Those exemptions give filmmakers the clarity and certainty they need to determine whether to move forward with spending tens, even hundreds of millions of dollars on a movie or TV series. If the statutory exemptions are not adequate, some producers will simply not proceed with their projects, a classic chilling effect that the First Amendment does not allow.

Second, the bill should preempt state laws that regulate the use of digital replicas in expressive works. Simply adding a federal layer on top of the existing patchwork of state laws would only exacerbate the problems associated with inconsistent laws in this area. Third, the scope of the right should focus on the replacement of performances by living performers. Going beyond that risks sweeping in wide swaths of First Amendment-protected speech, which would make the statute vulnerable to being struck down on overbreadth grounds. And fourth, the definition of digital replica must be focused on highly realistic depictions of individuals. It should not encompass, for example, cartoon versions of people you might see on shows like The Simpsons or South Park. And lastly, before legislating, MPA urges the subcommittee to first pause and ask whether the harms it seeks to address are already covered by existing law, such as defamation, fraud, the Lanham Act, or state right of publicity law. Often, the answer will be yes, indicating that a new law is not necessary, and if there is indeed a gap in the law, for example, regarding pornographic or election-related deepfakes, the best solution is narrow legislation targeting that specific problem. Thank you again for the opportunity to testify today, and I welcome your questions.

Sen. Chris Coons (D-DE):

Thank you, Mr. Sheffner. Mr. Davies.

Graham Davies:

Good afternoon, Chairman Coons and Ranking Member Tillis, and thank you to the committee for giving me the opportunity to speak to you today on this important issue. My name is Graham Davies; I'm President and CEO of the Digital Media Association, representing the leading music streaming services. We support the committee's efforts to bring forward legislation at the federal level, which should preempt existing state laws to keep pace with new technology. We join you in the objective of ensuring there are appropriate protections for individuals' likenesses. This is an important issue for us all, and DMA supports efforts to develop a clear and balanced way forward. Our members benefit from clarity in the law and from providing fans with great experiences. Indeed, our members have a strong track record of licensing complex rights to deliver music to fans. They work closely with record labels and music publishers, with whom they have long relationships and robust contracts. This is our common objective, but any new or increased rights should be appropriate and targeted. They should not come at the expense of important freedoms of speech or creative expression, nor should they be overly broad to the point of creating confusion or needless litigation over the true objective of protecting personhood.

The NO FAKES Act proposes to sweep in a broad range of legitimate replicas and downstream activities within its scope. The current draft punishes good and bad actors alike. Any new rights should not undermine the global content supply chains on which the streaming industry depends. We are still in the early stages of the application of AI by the artistic community, but we see that the existing practices for taking down illegal or deceptive content continue to suffice in this new context. Streaming services are the last point in the supply chain. Only the originator of the content and the label who delivers it to the services have the information necessary to establish whether the content is legitimate or not. Streaming services do not have any way to know the complex chain of rights behind the content they receive from labels and distributors. To address the harm caused when AI technology is used to imitate a musical artist, a celebrity, or another public figure, we believe the committee's objectives would be best achieved if new legislation were developed from the existing right of publicity laws. This would have a number of advantages. Firstly, there is a body of existing case law on how First Amendment protections can be balanced with the individual's right of publicity. Second, liability sits squarely with the bad actors, those who create the deceptive content and first place it into the public sphere.

And thirdly, the focus is on commercial use with actual damages, which we believe are proven to be a sufficient deterrent. Establishing a federal law that preempts the existing patchwork of privacy and publicity laws is both beneficial and necessary. Music streaming is a global industry. We believe that rights pertaining to the person should remain inextricably tied to the individual for the duration of their life. This ensures that each person is always able to retain control of how their voice is used. The discussion draft released by Chair Coons, Ranking Member Tillis, and Senators Blackburn and Klobuchar has been helpful to foster dialogue and encourage all stakeholders to think about these complex issues. I have included more in my written testimony, which is intended to support the next stages of the discussion, and DMA looks forward to continued work with the committee. Thank you.

Sen. Chris Coons (D-DE):

Thank you, Mr. Davies. Professor Ramsey?

Lisa P. Ramsey:

Chair Coons, Ranking Member Tillis, and other members of the subcommittee, thank you for the opportunity to testify today about the First Amendment implications of the proposed NO FAKES Act. I'm a professor of law at the University of San Diego School of Law. I teach intellectual property classes at USD, and my scholarship focuses on the potential conflicts between trademark laws and the rights of freedom of expression. The First Amendment of the US Constitution commands that Congress shall make no law that abridges the freedom of speech. Congress generally lacks the power to restrict expression because of its message, ideas, subject matter, or content. This rule is subject to a few limited exceptions for historically unprotected speech, such as fraudulent speech and obscenity, but content-based regulations of speech are generally presumed invalid unless the government can prove the law is constitutional. The NO FAKES Act imposes restrictions on the content of speech.

It targets the harms caused by the unauthorized creation and dissemination of digital replicas or deepfakes of individuals in recordings that are nearly indistinguishable from that person's actual voice, image, or visual likeness. When the Act applies to the use of digital replicas to impersonate people in fraudulent speech or misleading commercial speech, it is consistent with the First Amendment. There's also no conflict with the First Amendment when the Act restricts the use of digital replicas in sexually explicit deepfakes without consent if those images or videos constitute obscene speech or child pornography. The problem is that the current version of the NO FAKES Act also regulates non-misleading speech that is protected by the First Amendment. Congress must, therefore, prove that the Act satisfies constitutional scrutiny. The law must be narrowly tailored to directly and materially further its goals and not harm speech that's protected by the First Amendment more than necessary.

Strict scrutiny analysis may be required when the government is regulating the unauthorized use of digital replicas in political messages, news reporting, entertainment, and other types of non-commercial speech that are fully protected by the First Amendment. As it is currently drafted, I believe the NO FAKES Act is not consistent with the First Amendment because the law is overbroad and vague. However, I think a revised version of the law could satisfy intermediate and strict constitutional scrutiny. There are three ways that Congress can better protect First Amendment interests in the law. First, it's critical that the law not suppress or chill protected speech more than necessary. The Senate's proposed NO FAKES Act does a better job than the No AI Fraud Act in setting forth specific exceptions from liability for certain non-confusing uses of another's image, voice, or likeness, but the law can still be improved in certain ways that I discuss in my written testimony.

It is also important that Congress not enact a strict liability rule for online service providers that host expression covered by the NO FAKES Act. Specific and actual knowledge of the direct infringer's use of an unauthorized digital replica should be required for liability. Online service providers should implement a notice and takedown system to make it easier to remove unauthorized deepfakes that violate the law. Accused infringers must also be able to challenge takedown requests by filing a counter-notification with the platform. My second recommendation is for Congress to create separate causes of action that target the different harms caused by unauthorized uses of digital replicas. This includes, number one, the use of deepfakes to impersonate individuals in a deceptive manner; number two, uses of sexually explicit deepfakes; and number three, uses that substitute for a performance that an individual typically would have created in real life, such as a performance in a song or movie.

These causes of action should have different requirements and distinct speech-protective exceptions. My third recommendation is that Congress ensure each provision of the law adequately protects speech interests. Congress can better protect expressive values by allowing the new federal statute to preempt the inconsistent state laws that protect the right of publicity and digital replica rights, or laws that restrict the unauthorized use of digital replicas. If licensing of digital replica rights is allowed by the Act, individuals should be able to consent to each different use of their digital replica; allowing others to control a person's identity rights through a broad licensing agreement will work at cross purposes with many of the stated goals of this proposed legislation. It could potentially lead to greater AI-generated deception of the public. It can also stifle the right of people to make a living through their performances and result in the use of their image or their voice in sexually explicit material that was authorized by the broad terms of a licensing agreement. I encourage Congress to continue to protect the interests of both public figures and ordinary people in the NO FAKES Act, and I also encourage you to continue consulting with stakeholders, academics, and attorneys with expertise in this field of law. I look forward to answering your questions as you continue to improve the Act. Thank you.

Sen. Chris Coons (D-DE):

Thank you. Thank you to all six of our witnesses for your preparation and your engagement. I'm going to start with some questions exploring how AI replicas are impacting individuals and entertainment businesses and then use a subsequent round to get into your perspectives on specific potential revisions to the NO FAKES Act. Mr. Crabtree-Ireland, thank you for sharing your personal experience of an AI-generated deepfake; this was in the context of the ratification fight for the most recent contract. Given your experience, should a digital replica right apply to all individuals, regardless of whether they're commercializing their image, voice, or visual likeness? You primarily represent people who make a living commercializing their image, voice, or visual likeness. Why should we have this available to everyone?

Duncan Crabtree-Ireland:

No, it's a great question, Chairman, and yes, we support a right that's available to everyone. Obviously, Twigs, myself, others, Mr. Kyncl, have explained the impact that this can have on people who make a living and whose career is based on their image, likeness, or voice. But the impacts are so obvious and so real for so many Americans outside the scope of just a commercialized use. And the example I gave, in my mind, is not a commercial use example. This is an example that could apply to anyone, and the impact is so serious. So yes, we do support this right on a broader basis, and it should be applicable to everyone.

Sen. Chris Coons (D-DE):

Twigs, could you help us understand how you're using AI as a creative tool on the one hand, and then briefly tell us a little bit about your experience with AI deepfakes and what you think the future of your industry looks like if we don't heed your urgent call for us to act.

Tahliah Debrett Barnett (“FKA twigs”):

Well, over the past year, I've been creating an AI version of myself that can use my tone of voice exactly to speak in multiple languages. I've done this to be able to reach more of my fans and to be able to speak to them in the nuance of their language. So, I've currently explored French, Korean and Japanese, which is really exciting for me. It means that even with my upcoming album, I can really explain in depth what it's about creatively. It also allows me to spend more time making art. Often being a music artist or any artist in this day and age requires a lot of press and a lot of promo, a lot of one-liners. So it means if it's something simple that doesn't really require my heart, I can do a one-liner and give it to people to promote a piece of work, and it's, it's harmless, but ultimately I can spend more time making something that's really meaningful for my fans. And the next question, oh, you asked how your..

Sen. Chris Coons (D-DE):

Experience with deepfakes?

Tahliah Debrett Barnett (“FKA twigs”):

Yeah, so there are songs online, collaborations between myself and other artists, that I didn't make. It makes me feel vulnerable because, first of all, as an artist, I think the thing that I love about what I do is that I'm very precise. I take my time with things, and I am very proud of my work, and I'm very proud of the fact that I think my fans really trust me. They know that I put so much deep meaning, my North Star, into what I do. So the fact that somebody could take my voice, change lyrics, change messaging, maybe work with an artist that I didn't want to work with, or maybe work with an artist that I wanted to work with, and now the surprise is ruined, it really leaves me very raw and very vulnerable. I think that if legislation isn't put in place to protect artists, not only will we let down artists who really care about what we do and have spent a long time developing themselves and the way that we work, but it also would mean that fans wouldn't be able to trust people that they've spent so many years investing in. It would affect us spiritually, and it would affect us financially. Honestly, I'm just surprised that we're even having this conversation because it feels so painfully obvious to me, so much so that it's hard to even find the language, if I'm completely honest with you.

Sen. Chris Coons (D-DE):

There are a lot of things that painfully obviously call out for Congress to act. So your surprise is not unusual.

Tahliah Debrett Barnett (“FKA twigs”):

But ultimately, what it boils down to is my spirit, my artist, and my brand. I've spent years developing it, and it's mine. It doesn't belong to anybody else to be used in a commercial sense, cultural sense, or even just for a laugh. I am me. I'm a human being, and we have to protect that.

Sen. Chris Coons (D-DE):

Thank you. Mr. Kyncl, if I might just briefly, we've seen a steady increase in the quality of deepfakes, with songs on streaming platforms that are virtually indistinguishable from talented artists like Twigs. What are the challenges that AI deepfakes are creating long-term for both the music business and fans as well as performers?

Robert Kyncl:

So I think Twigs addressed one of those, and no one can do that better than she just did. I think the second one is that when you have these deepfakes out there, the artists are actually competing with themselves for revenue on streaming platforms, because there's a fixed amount of revenue within each of the streaming platforms. And if somebody is uploading fake songs of Twigs and those songs are eating into that revenue pool, there is less left for her authentic songs. So that's the economic impact of it long-term. And the volume of content that will then flow into the digital service providers will increase exponentially, which will make it harder for artists to be heard and actually reach lots of fans. So creativity over time will be stifled.

Sen. Chris Coons (D-DE):

So, as you both put it, there's a relationship impact, a spiritual impact, a financial impact, and a broader ecosystem of creativity impact. Senator Tillis, I turn to you.

Sen. Thom Tillis (R-NC):

Thank you, Chairman Coons, and again, thank you all for being here. Ms. Ramsey, I'm going to start with you and then hear from others who may have an opinion on it. You mentioned notice and takedown in your comments; this is a strict liability bill in its current form, and some of us think that maybe we have to wade into that. You also talked a bit about, I guess, giving the individual who's been subject to a takedown some recourse. Can you talk a little bit more about that briefly?

Lisa P. Ramsey:

Sure. You might have a situation where somebody challenges your own personal use of your identity online, and they're the one that's the bad actor, but they file a complaint with the online service provider, and the online service provider, who wants to avoid liability, automatically takes it down. That's one possibility. Another would be that the person who is disseminating this image actually has a defense, right? An exception applies to this particular use; it might be news reporting or parody. And so it's critical for the online service provider to be able to put that expression back up if it actually does not violate the law. Under the copyright laws, my understanding is that once the information's put back up, it stays up unless the copyright owner files a lawsuit. That's what's great about a notice and takedown procedure: it allows ordinary people to get these unauthorized uses off the internet, and I think that's one real benefit of having such a procedure and encouraging companies to adopt one. There are some challenges, though, with notice and takedown procedures that folks like Eric Goldman and others have talked about. So it's great that you're talking to interested parties as you figure out these issues.

Sen. Thom Tillis (R-NC):

Does anybody here have an opinion contrary to that? Okay. Mr. Kyncl, can you walk me through what rights are typically granted to record labels under exclusive sound recording agreements and whether likenesses are included in those?

Robert Kyncl:

So it's a pretty wide range of rights, anywhere from a full copyright grant to distribution-only rights where the copyright remains with the artist, and increasingly they include likeness as well. Because, as you can imagine, as we work on open platforms with lots of user-generated content, we are the ones who have a staff of people working to issue notices, claim the content, and take down the content, and increasingly we need the name, image, likeness, and voice rights in order to actually act on the artist's behalf with the platforms.

Sen. Thom Tillis (R-NC):

I think you believe that new digital replica rights need to be fully transferable?

Robert Kyncl:

Yes.

Sen. Thom Tillis (R-NC):

Why isn't a license enough?

Robert Kyncl:

I think it should be the artist's choice. The artist should have a choice to either transfer or license.

Sen. Thom Tillis (R-NC):

Okay. Mr. Sheffner, state-level right of publicity laws restricting commercial speech have existed for many decades. They've developed their own case law, and they're well understood. The new digital replica right proposed by the NO FAKES Act would affect non-commercial speech beyond what most state laws currently cover. Can you explain how novel this proposed right would be in the context of existing right of publicity laws and how we should consider preempting similar state-level digital replica laws, especially when it's such new territory?

Ben Sheffner:

So thank you for the question, Senator Tillis, and you're absolutely right that most state right of publicity laws, which have existed for more than a century, are really limited to commercial uses, that is, in advertisements or on merchandise. What Congress is considering doing here is really novel. Although it's sometimes described as a right of publicity, we think it's fundamentally different in that it would apply to expressive works like movies, TV shows, and songs, which are fully protected by the First Amendment. There has developed a robust body of case law in the traditional commercial right of publicity context which says, yes, it applies if you put somebody's face on a billboard or use it in an advertisement or on a lunchbox, but it doesn't apply if, for example, you're making a biopic or a docudrama about somebody; you can't use right of publicity law to censor those portrayals. Again, this is a novel form of right, and it is going to be subject to heightened constitutional scrutiny, as Professor Ramsey described, because it applies to expressive works. It's really important upfront to provide some clarity to film producers so that when they're about to embark on a project, they know what's allowed and what's not. And if it's too vague, if it's too uncertain, they're going to shy away from using this technology to engage in those sorts of portrayals. Again, that chills speech, and the First Amendment case law says that a statute is vulnerable to being struck down if it chills constitutionally protected speech.

Sen. Thom Tillis (R-NC):

That's why we absolutely have to get it right. I think there's a general consensus that we have to make progress, but given the risk of all this work being struck down, and its significance, we have to do the legwork. Thank you. I'll have a second round.

Sen. Chris Coons (D-DE):

Great. Thank you. Senator Tillis. Senator Hirono.

Sen. Mazie Hirono (D-HI):

Thank you, Mr. Chairman and Ranking Member Tillis, for bringing this bill before us. And as you say, Mr. Chairman, the bill has gone through a lot of input from a lot of different groups, and if I listened to your testimony accurately, it doesn't sound as though any of you think that we should not do something to protect, and I like this framing, personhood. Do any of you think that we don't need to do anything in this area? Okay. So, looking at this statute, then, why don't we go down the list very quickly? What do you like most about the current bill, and what is the most important thing you would want to change, if anything? We'll just start with Mr. Kyncl. And if you can, just keep your answer really short.

Robert Kyncl:

I'll start with what I believe it needs to contain, which is consent for the use of people's names, likenesses, and voices to train AI models and create outputs; that needs to happen. Second, it needs to contain monetization, which is a fair market license that that person can exercise through that consent. But in order for that to happen, and for all of that to be operationalized by the platforms, we need two things. One is for the provenance of the content that generative AI models are trained on, and that they output, to be retained, which means they should keep sufficiently detailed records of what they trained on so that later on, that provenance can be embedded into watermarks recognized by the platforms on which the content appears.

Sen. Mazie Hirono (D-HI):

The point is, it sounds like consent is the critical part of this, consent of the creator.

Robert Kyncl:

And provenance of the content. We are good at tracing provenance on luxury clothing, on cheese, on wine; we should be able to do it on intellectual property as well.

Sen. Mazie Hirono (D-HI):

Going down the line, we're talking about this particular bill. Is there something in this bill that you think is the most critical aspect of the bill that you support, and is there anything that you would change in the bill?

Tahliah Debrett Barnett (“FKA twigs”):

I think the most important thing is to put the power in artists' hands. I want to be in control of my likeness, my brand, my legacy. I have sacrificed so many years to be good at dancing, at singing, so much financial input and so much time, and I do it in the name of my legacy. I do it so that one day I can look back at my body of work and say that it was me. That is what I want protected in the bill.

Duncan Crabtree-Ireland:

Thank you, Senator. I think what I like most about this bill is the fact that it is broader than a limitation to commercial use. The fact is, a commercial use limitation may have worked a hundred years ago, but it does not solve the problems that we face today, especially because of generative AI, and we need the breadth that is reflected in this legislation. If there were one thing I would change, I would adopt a durational limitation on transfers, or even licenses, of these rights during a lifetime; it may not be as necessary after death, but during a lifetime I think it's essential in order to make sure that someone doesn't improvidently grant a transfer of rights early in their lifetime that turns out to be unfair to them. And I think there are various standards we could look at for an appropriate duration.

Sen. Mazie Hirono (D-HI):

So 70 years is a bit long.

Duncan Crabtree-Ireland:

Well, I'm sorry. The 70 years, though, is the duration of the right in the bill after death. I'm talking about the duration of a transfer, even during life. So if you had, say, a 21-year-old artist who's granting a transfer of rights in their image, likeness, or voice, there should not be a possibility of licensing that for 50 years or 60 years during their life and not have any ability to renegotiate that transfer. So I think there should be a shorter, perhaps seven-year limitation on those transfers.

Sen. Mazie Hirono (D-HI):

Makes sense.

Ben Sheffner:

So, Senator, one thing that we do like about the bill is the First Amendment exemptions. We think they're most of the way there to giving our members the clarity and certainty they need. I think they can be improved a little bit, and we have some specific, fairly technical changes that we recommend. One thing that we would recommend changing is that there's currently essentially a no-preemption provision. We think it should essentially be the opposite.

[recording interrupted]

Graham Davies:

Thank you. The question, I think..

Sen. Mazie Hirono (D-HI):

Chairman, do you mind if we just continue with the responses?

Graham Davies:

Yeah. Building on some of the things said, the effort here to protect personhood is something we very much encourage with the draft. The fact is that it's a discussion draft. In terms of the key areas we want to focus on, the first is where liability sits; we would encourage it to be focused on the creator and those who first release the content. We would also prefer that it be based on the existing body of right of publicity law rather than IP, with actual damages rather than statutory damages. And the preemption point.

Sen. Mazie Hirono (D-HI):

Professor.

Lisa P. Ramsey:

Alright. Thank you for your question. So, what do I like most about the bill? I love the specific exclusions from liability, even though there might be some additional changes, and the fact that you're protecting personhood, although I'll note that state right of publicity laws do sometimes apply to non-commercial uses of a person's identity. The Zacchini case involved the use of his entire act in a news report, so that's not a commercial use. And in Comedy III, the California Supreme Court applied the right to a literal rendition of the Three Stooges in a lithograph, which is also not commercial speech. So there are some circumstances where current laws do apply to non-commercial uses of a person's identity. And also, false endorsement laws can apply to non-commercial uses, but the identity has to be used in connection with goods and services. So there might be non-commercial uses in connection with goods and services. What should we change? Well, I think there's a tie. You said to pick one, but I have to pick two.

[recording interrupted]

So first, I think we should have separate causes of action with distinct defenses. A disclaimer might address a deceptive use of someone's likeness because it dispels any confusion, but a disclaimer doesn't make sense when it's a sexually explicit deepfake that's been put out there without consent. You might have different requirements with regard to commercial use, right? If it's just a general right of publicity, you might have it apply to both commercial and non-commercial speech.

And then my other part of the tie is the provisions with regard to limits, or no limits, on the scope of licensing. My concern is that individuals without significant bargaining power at the early stages of their career might sign a contract, maybe a long contract that has a digital rights provision in it, and sign away the right to their identity for a lengthy period of time and for use in any context. And so I would like to see some way for Congress to encourage or require those folks who are negotiating these agreements to perhaps have a specific use authorization for a certain movie, as opposed to use of your digital identity in any context, right? Or, instead of a lengthy period of time, instead of 56 years, I would say maybe one to five years. I'm not an expert on what's a good term, but I think it's critical to make that shorter rather than longer, because a lot of these individuals are not going to have that bargaining power.

Sen. Mazie Hirono (D-HI):

Thank you.

Sen. Marsha Blackburn (R-TN):

Thank you, Mr. Chair. We've spent a lot of time moving this forward, so I'm pleased we are at the hearing stage on this. Now, I represent Tennessee, so it doesn't matter if you're on Beale Street or if you're on Music Row, or maybe you're working with Naxos or one of the symphony distributors. We distribute more symphonic music out of Nashville, Tennessee than anyone else. We've got gospel, got church music. [recording error] … I appreciate the comment you made when we were visiting, preparing for the hearing. You said we got data privacy wrong. We still haven't done data privacy, and we can't afford to get AI wrong. It's going to require that we take action, and I think this act is the baseline for a federal action. So I'd like to hear from you, if you will, sir, about the need.

Robert Kyncl:

We are in a unique moment in time where we can still act and we can get it right before it gets out of hand. The genie is not yet out of the bottle, but it will be soon. And as Senator Blackburn said, we got it wrong on data privacy; we waited too long. Let's not get this wrong. The speed at which this will happen will be afforded by the open-sourcing of foundational AI models. Everything accelerates exponentially, and therefore it's imperative that Congress acts this year. Thank you.

Sen. Marsha Blackburn (R-TN):

We've heard some commentators talking about, well, you've got existing law when it comes to privacy or to personal property and intellectual property protections, and so you can rest on that existing law, and that is sufficient to go in and get a take-down order on some of these AI fakes. Talk to me about why that is not sufficient.

Robert Kyncl:

I mean, today, if you think about privacy: how many spam emails do you get every single day in your inbox? Quite a lot. Your personal information is leaking everywhere, whether it's being sold or whether it's just being taken; it's just not safeguarded properly. When that happens with your face and your voice, it's a whole new game for you. And this will happen at a volume that is impossible for every single person to try to manage personally, which means it has to be solved with technology. It is technology that will unleash it, and it has to be technology that helps manage it, which is why it's important for us to work with the technology platforms to solve this, and why we have to have a working bill and a working law that can be operationalized by all of us. The existing framework is simply whack-a-mole, and it doesn't work.

Sen. Marsha Blackburn (R-TN):

Let me ask you this — Mr. Chairman, if I can get one more question — do you think that the platforms should be held responsible for unauthorized AI fakes that they're continuing to allow to be distributed?

Robert Kyncl:

I think we need to develop the sort of conditions that they should meet, and then if they don't, then yes, but there has to be an opportunity for them to cooperate and work together with all of us to make it so, and that I think is the detailed work that needs to happen. But when we achieve that, then it'll work and there'll be good actors and many of them are. So I think it's through that collaboration that we can wrestle this down.

Sen. Marsha Blackburn (R-TN):

Thank you. Thank you, Mr. Chairman.

Sen. Chris Coons (D-DE):

Thank you, Senator Blackburn, and thank you for your cooperation in moving forward this great bill. I have a whole series of questions I want to ask about potential tweaks, so I'm going to try and move relatively quickly if I might. Mr. Sheffner, you testified we have to include First Amendment exceptions for uses in works that have a public interest or newsworthy value. Some people say that any work involving a celebrity is newsworthy or in the public interest, and that raises the challenge of how we define First Amendment exceptions to ensure they don't just swallow up the rule and permit all kinds of uses that the bill is, in fact, trying to stop. So I'd be interested in your views on how we narrow that. And Professor Ramsey, how would you craft the First Amendment exceptions to make sure that they don't swallow up the whole bill, particularly with regard to what is newsworthy?

Ben Sheffner:

Sure. So Senator Coons, we've talked to your staff, with whom we have a great relationship, we've talked to other stakeholders, and we've listened to the concerns that have been raised that maybe these exceptions are overbroad and could somehow swallow the right itself. We've listened, and we've suggested tweaks to make sure that those exceptions do not apply if the use of the digital replica is deceptive. We don't support fraud. Fraud is not protected by the First Amendment, and it should not be allowed. But one other thing I would just say is that these types of statutory exemptions have been routinely included in state right of publicity laws over the last 25 years or so, since the late nineties. And one thing that we haven't seen is this type of abuse of those exceptions. They've worked very well in separating out the uses where you should need to get permission — again, to put somebody's face on a billboard or on a lunchbox — versus the biopics.

Sen. Chris Coons (D-DE):

Got it. Professor briefly.

Lisa P. Ramsey:

So Christine Farley and I just recently wrote a paper about how we can balance trademark and free speech rights when someone uses a trademark in an informational or expressive way, like in a news report, entertainment, or things like that. And I think our proposal in that context might also work here. As you mentioned, some of these kinds of uses can actually be bad: impersonation, et cetera. So one approach, in addition to listing out these potential defenses, would be to say that if this is an informational or expressive use that is a false statement or false representation — so you actually say this is a certain celebrity when it's not, or a certain teenage girl when it's not — that would still be actionable, even though there's some argument that it's expressive; or likewise if the use is likely to mislead a reasonable person about the source of the message or the speaker's identity. That way, you would at least have courts consider whether it's an informational or expressive use, but you'd also have the safety valve that if it's really causing harm because it's deceptive, then you can still regulate it.

Sen. Chris Coons (D-DE):

Understood. Mr. Davies, today you raised concerns that the bill lacks a mechanism for demonstrating that your members had knowledge. Should we incorporate a notice-and-take-down structure? If so, should it be the DMCA's notice-and-take-down provisions? Is there another mechanism you'd urge us to consider for establishing knowledge, actual or constructive?

Graham Davies:

Thank you for the question. In terms of the current situation, our members are the leading streaming services, so they're handling the majority of music streaming consumption, and the processes are working very well. In the example that you used, and the other Drake example, which is a common one, there's been no challenge in taking down the content expeditiously. So we don't see our members needing any additional burdens or incentives here. But we do understand that the committee is keen to look at whether there is to be secondary liability; we would very much seek a safe harbor for an effective takedown. As for the DMCA notice-and-take-down process, we don't see it as being a good process here. It was designed for copyright, and our position is that this is a different set of rights.

That said, our members absolutely can work with the committee on what we would think would be an effective notice and take-down, building on some of the points the professor has made. It's really essential that we get specific information on how to identify the offending content so that it can be removed efficiently. We need information on why the content is offending and on what basis, and also information on the notifier, so that if there is an objection to the notification, that can take place.

Sen. Chris Coons (D-DE):

Two more questions. If I might, I want to talk about preemption briefly, Professor. Several witnesses have described the existing state right of publicity laws as a difficult-to-navigate patchwork. Should our bill broadly preempt state laws or limit preemption to those state laws governing the use of unauthorized digital replicas?

Lisa P. Ramsey:

So, I teach right of publicity law, trademark law, and an intellectual property survey, and I teach it every year. I think we really need a federal right of publicity law, right? The state laws are so different — Jennifer Rothman has a great blog that catalogs all the different laws — and even within a state, the statutory provisions have different rules than the common law provisions.

Sen. Chris Coons (D-DE):

I take it your answer is yes.

Lisa P. Ramsey:

Well, yes. I'm just building up to it. So yes, we need preemption, right? The challenge, obviously, is that Congress — and you're doing a great job trying to get this right — has to get it right first. Then you preempt state laws, and it simplifies everything for litigants and for judges, instead of their having to figure out which law is going to apply. Right now there's forum shopping going on; people will file suit in whatever state is going to be best for their interests. So yes, preemption is a great idea.

Sen. Chris Coons (D-DE):

The discussion draft has a 70-year postmortem provision, modeled after the Copyright Act. Postmortem rights are important, but we understand 70 years is a long time, especially for individuals who don't commercialize their image, voice, or likeness. I'd be interested — jump in, any of you, perhaps Mr. Sheffner first and then others. Should postmortem terms be longer for individuals who commercialize image, voice, and likeness? Should they be limited? Should they be reviewed and extended every decade or so? How would you handle postmortem rights? The draft has 70 years postmortem, and some of you have enthusiastically supported that as part of your creative legacy; others have raised concerns. Mr. Sheffner, you kick us off, and we'll do this one quickly.

Ben Sheffner:

Sure. So, we view this again through the lens that this is a content-based regulation of speech. And as Professor Ramsey said in her opening statement, a content-based regulation of speech needs to be justified by a compelling government interest and narrowly tailored to serve that interest. What we have said is that, as for living professional performers, use of a digital replica without their consent impacts their ability to earn a living; you have a compelling government interest in regulating there, and it would be appropriate for Congress to regulate. Postmortem, that job-preservation justification goes away. And I have yet to hear a compelling government interest in protecting digital replicas once somebody is deceased. So I think there are going to be serious First Amendment problems with extending a right that applies in expressive works postmortem.

Sen. Chris Coons (D-DE):

Any other witnesses who think preserving an individual's legacy and property rights is worthy of some protection? Professor Ramsey and then Mr. Crabtree-Ireland.

Lisa P. Ramsey:

So, this is not going to shock you, but I'm going to say it depends on the goal of the law.

Sen. Chris Coons (D-DE):

You really do belong as a Professor.

Lisa P. Ramsey:

So if we're talking about a law that's regulating deceptive uses of someone's identity, or a law that is governing sexually explicit deepfakes, it seems to me that it's fine to have a long-term postmortem right, possibly life plus 70. If we're talking, though, just about protection of a broad federal right of publicity, maybe not so much. I haven't written in this area, but I would recommend looking at the work of scholars like Mark Bartholomew; Jennifer Rothman, I think, is working on a paper, et cetera.

Sen. Chris Coons (D-DE):

Mr. Crabtree-Ireland?

Duncan Crabtree-Ireland:

Thank you. I mean, to me, it's shocking that anyone would think that this right doesn't deserve to be preserved and protected after death. For all the reasons that Twigs stated about how personal this is — it's an economic right, it's a personal right, and it's something that has real value. So why should that somehow dissipate upon death and make itself available to big corporate interests like the ones represented by some folks here? That doesn't make any sense. I would argue that there shouldn't be a 70-year limitation at all. This right should be perpetual. And the reason why this right should be perpetual is that every one of us is unique. There is no other Twigs, and there never will be. There is no other you, or any of us. This is not the same thing as copyright. It's not a case of saying we're going to use this to create more creativity on top of it later. This is about a person's legacy. This is about a person's right to give this to their family and let their family take advantage of the economic benefits they worked their whole life to achieve. So, from my perspective, this is an intellectual property right that deserves protection. It should absolutely be protected after death. And I am waiting to hear a good reason why it shouldn't be, to be honest with you.

Sen. Chris Coons (D-DE):

In perpetuity, or not at all. Mr. Kyncl, see if you can help us bring this home.

Robert Kyncl:

I'll make this very brief for you. I agree with Mr. Duncan Crabtree-Ireland, 100%.

Sen. Chris Coons (D-DE):

Thank you all for that. Twigs, would you like to comment on that? Forgive me.

Tahliah Debrett Barnett (“FKA twigs”):

I was going to say that I've worked so hard throughout my career, and when I die, I would like everything that I've created to go to my family and my estate, which will have clear instructions on how I want to preserve my history and all of the art that I've created.

Sen. Chris Coons (D-DE):

Thank you, Senator Blumenthal.

Sen. Richard Blumenthal (D-CT):

Thank you very much, Mr. Chairman. I got off a plane about 20 minutes ago coming from Connecticut, so I do apologize for missing the bulk of the hearing. As you may have heard, we had no votes yesterday, so today was a partial day off, and I had plans in Connecticut. So I am grateful to all of you for being here, and we are very, very hopeful that you're in good health and that you're going to continue creating. And I'm a big fan of your work, so thank you for being here particularly. Thank you, Mr. Chairman, for having this hearing, which focuses on a bill that you're going to introduce. I'd like to be added at the appropriate time as a co-sponsor. I'm a strong supporter, and I believe that there ought to be a federal right for people whose image and voice are used without their consent, whether it is an actor or a songwriter or a singer or an athlete. What is shared here is a right in one's own likeness and creation as a person, an individual right. And I think there ought to be a right to take legal action under that right; a right without a remedy is unavailing, as we know from our first year in law school — which for me was quite a few years ago, but I've seen it repeated again and again in real life as a prosecutor, as an advocate, and as a litigator. But I'd also like to focus on a complementary remedy, which could be watermarking or identification, attribution, or giving credit.

Not just the deepfake and the right to recover as a result of use of it without attribution or credit, so to speak, without watermarking, but also that kind of identification, public crediting of a work. And I'm asking not only in the abstract: I chair a different subcommittee of the Judiciary Committee, called Privacy, Technology, and the Law, and the Ranking Member of that subcommittee, Senator Josh Hawley of Missouri, and I have set forth a framework. It's the most comprehensive bipartisan framework right now, and we should do more than adopting the kind of measure that Senator Coons and others have proposed: it would provide a requirement for watermarking, as well as an entity to oversee risk-based mandatory licensing of AI models, and other measures like transparency and so forth. It's a more comprehensive approach, but my question is really focused on watermarking. Let me ask all the witnesses: how can watermarking complement rules requiring permission to use someone's likeness or voice or creation in a deepfake or an impersonation, or simply using it without permission?

Robert Kyncl:

Maybe I'll take it. So, thank you for all of your work on this important issue. I think without attribution achieved through watermarking, we won't be able to operationalize what we're talking about here today, so you're focusing on absolutely the right issue. The important part is to determine the provenance of content that's being displayed and its degrees of similarity to the original, and then it is up to the rights holders — whether it's artists, music companies, movie studios, et cetera — to negotiate commercial relationships with the platforms, separate and aside from the laws, using all of those mechanisms. We've actually done this when I was at YouTube. This is precisely what we did with user-generated content; we just did it in the copyright scheme, where it was the exact content referenced, and we built a whole framework around that. This is merely that on steroids, adapted for the AI age, with many more shades of gray and much more speed. But it's really just upgrading that. The framework exists — it has been developed by companies like YouTube, which is best in class in that — and therefore I'm hopeful that we can take it further and apply it to AI as it relates to voice and degrees of similarity, using watermarks to label content and carry its provenance.

Sen. Richard Blumenthal (D-CT):

Thank you.

Tahliah Debrett Barnett (“FKA twigs”):

I mean, I can only really talk from personal experience. In the last six months, I had 85 of my songs leaked online, which was basically the whole of my experimentation for my next album. It was really scary, because it felt like having the whole of my notepad, I guess, of all my ideas put out to the whole world before it was ready. But on the flip side, I felt very secure because I was able to call up my label and say, hey, this has happened, and immediately they could go and take it down; it just disappeared, and now you can't find it. So I think that watermarking would protect artists, because then we'll have a point of call to go to, to say, this has happened, and immediately whatever's been leaked or put online can be taken down. But one thing I will say is that the thing that's really scary is that once something is out in the world, we can't take it back. So if someone uses my likeness and says something offensive or something harmful, people might think that it's me. And we've all seen in the news when someone does something wrong and the big story is on the front page, but then — oh no, they actually didn't do anything wrong, it was a mistake — and the correction is so small. And I think that's the thing I'm scared about: even if something gets out into the world that's not me, it's the reputational damage that it will do, and the financial and cultural harm that won't be able to be amended after the fact.

Sen. Richard Blumenthal (D-CT):

That's a very good point. If the Chairman would give me a little more time, I'd be interested in the others' answers. Thank you.

Duncan Crabtree-Ireland:

Thank you. I agree with Mr. Kyncl on the value of watermarking and other tools as well. C2PA, the coalition is working on that, but I also just want to caution, especially in deepfakes, it was mentioned earlier the idea of disclaimers solving problems there or the ideas of watermarking solving problems there. We also have to make sure that the tools that we use to protect against abuses of these technologies are realistic. And so expecting viewers of content online to read deeply into captions to find disclaimers or things like that, that doesn't really solve this problem. So I hope as the committee considers what to do, it's not enticed into thinking that that type of solution actually solves the problem. It needs to be more front-facing so that the message that's delivered is received by all those who view it.

Ben Sheffner:

So thank you for the question, Senator Blumenthal. As Mr. Kyncl was saying, watermarking has proved useful in certain copyright contexts. I think he was referring to YouTube's Content ID system, which has been a great help in reducing the presence of pirated material on that platform. I would just say, again from our experience with copyright law, that it's not a silver bullet. It can sometimes help identify the original source of pirated material, but just because material has a watermark doesn't stop it from being further disseminated, et cetera. So there are really no silver bullets in this context.

Graham Davies:

Thank you for the question. I'm going to build on things that have already been said. Robert talked about the partnerships between the services and the rights holders; these are absolutely essential. This is where the content comes from for the services, so we're very reliant on the data, on the metadata that exists. It's true to say that data in the music industry already presents significant challenges, but these are challenges we work on together.

Sen. Richard Blumenthal (D-CT):

Thank you, professor.

Lisa P. Ramsey:

So I'll incorporate by reference everything that's been said before, but I'll also say that someone using a digital replica to impersonate someone, or putting out a sexually explicit deepfake, is not going to use this kind of technology, so it's not going to help in certain circumstances.

Sen. Richard Blumenthal (D-CT):

Yeah, and I meant it that way. I think I used the word complementary; if not, I meant to say complementary. I didn't mean it as a substitute. So, I take all these comments as very helpful and valid. Thank you, Mr. Chairman.

Sen. Chris Coons (D-DE):

On behalf of the Chair, Senator Klobuchar.

Sen. Amy Klobuchar (D-MN):

Thank you very much. That was an AI attempt — I know, it kind of failed; kind of close, not quite. Okay. Professor Ramsey, since you ended there, I'll pick up where you were, on what you and some of the other witnesses mentioned about these deepfakes, whether it's sexually explicit images or political robocalls or videos or ads. I wasn't going to start this way, but it makes sense here because of what you just said: some of this, we just have to get off there. People are not going to listen to a major candidate for President for three minutes and then look and see a label. And I think that in other countries, that's what they've done. That's why Senator Hawley, Senator Coons, Senator Collins, and a number of other senators have come together. We are marking up this bill along with a labeling bill on elections in the Rules Committee. Could you talk about why that kind of targeted approach to these hair-on-fire things is very important, given the timing of all of this?

Lisa P. Ramsey:

As you can expect, I love the fact that you're working on these targeted laws, but again, one of the things we need to do is protect ordinary people from impersonation. Over Thanksgiving, someone called my dad while I was standing right next to him, and it sounded just like my brother. He said he was in jail and he needed money to get out. My dad was not duped by this, but the fact is some people have been, as the senators have noted. So I think it's a great idea, but I think we still need the broader Act to deal with these kinds of issues for folks who are not politicians, et cetera.

Sen. Amy Klobuchar (D-MN):

Exactly. And my State Director's son is in the Marines, and her husband got a call that was an impersonation that scraped his voice. They didn't know where he was stationed. So we're going to see all of this deployed against military families as well — really, all these kinds of scams. I see the heaven of some of the great uses of AI, especially in healthcare, but then there's the hell part, and it should be our job to try to put the guardrails in place, which is why I'm so honored to be working with Senators Coons and Tillis and Blackburn on this bill. One of the things that interested me during the testimony — Mr. Sheffner and Mr. Crabtree-Ireland, you kind of got to this — is that both the NO FAKES Act and this election bill include exemptions, exceptions for the use of digital replicas, to ensure the bills do not chill speech protected by the First Amendment. Could you talk a little more about how we can write these — as I have tried, with exceptions for satire, in the elections bill with Senator Hawley — to ensure that common-sense safeguards do not chill protected speech and that this is upheld in court?

Ben Sheffner:

Right. So, Senator Klobuchar, I just want to say, agreeing with Professor Ramsey, that I think your approach of having specific legislation on pornographic deepfakes and other legislation on election-related deepfakes is really the right way to go. When you have a broad bill that essentially says you need permission to use digital replicas and then lets courts sort it all out, that's where you get into trouble: you have an overbroad bill that is necessarily going to end up encompassing protected speech, which makes it vulnerable to being struck down on overbreadth grounds. So these kinds of exceptions, I think, are specific to the type of legislation. In the world of movies, the studios that we represent at the MPA make a lot of movies that are based on or inspired by real people and events — of all the Best Picture nominees over the last five years, approximately half are based on or inspired by real people and events. Our studios want to make sure that legislation like this doesn't interfere with their ability to do that. But when you're talking about, say, non-consensual pornographic deepfakes, you don't need those exceptions for biopics and satire and parody; that stuff is bad in almost every circumstance you can think of. And I think this narrowly targeted approach is really the right way to go.

Sen. Amy Klobuchar (D-MN):

Okay, so Mr. Duncan Crabtree-Ireland, you have the best long name in the world.

Duncan Crabtree-Ireland:

Thank you.

Sen. Amy Klobuchar (D-MN):

Could you talk about balancing that right of creators with the rights of those whose voice or likeness may be at risk — you're sitting next to one of them right there, with Twigs. How do you believe we should balance that?

Duncan Crabtree-Ireland:

Absolutely. I think we all agree that, obviously the First Amendment has to be protected and that expressive speech is important. I think the exceptions that are written into this discussion draft now are not that far off, but I think it's important that they not be expanded upon, nor that they be broader than necessary because the fact is we can't anticipate what this technology is going to do tomorrow. We cannot anticipate every iteration of this. And while there are certain specific uses or concerns that are being addressed by legislation, like the legislation you've referenced, there is a broader need for protection. The example I gave in my opening statement is one, Twigs has given examples as they applied to her. And so we do need to have that proper balance. And I am concerned that we are only looking at one side of the First Amendment consideration here.

The other side of the First Amendment consideration is the right that each of us has to our own freedom of speech: to be able to communicate our ideas, to associate ourselves with ideas that we want to associate with, and not be associated with ideas we disagree with. And that is being really trampled on right now by this unfettered ability of people, without a federal right, to do things like the deepfake I experienced, the ones she experienced, et cetera. So I do feel like the committee is going to have to work on defining these exceptions, making sure they are no broader than necessary to keep the legislation viable, but also to make sure they don't swallow up the rule. As the Chairman said, if we make them so broad that they swallow up the rule, then all of this work will have been for naught. And the reality is, today is not like 10 years ago; it's not like 30 years ago. This technology is fundamentally different, and what it can do with all of our faces and voices calls out — it screams out — for a remedy that's actually effective.

Sen. Amy Klobuchar (D-MN):

And do you see — and maybe anyone: Twigs, any of you, Mr. Kyncl — want to speak to this need for a national standard? Senator Blackburn worked with us on this bill and is going to be a co-sponsor, and Tennessee just did the ELVIS Act. Of course, in Minnesota, we have the Dylan Act and the Prince Act — no, I just made that up. But we do have people, as you know, who are fiercely, fiercely independent and protective of their incredible music in our state, and we have a common law in Minnesota that's helpful. So there's this state-by-state patchwork. A few of you, if you want to, just talk about this need to have a national standard and why it's so important.

Robert Kyncl:

Maybe if I can.

Sen. Amy Klobuchar (D-MN):

Okay Mr. Kyncl.

Robert Kyncl:

I just want to comment on some of the things from before. As someone who grew up without the First Amendment, I value it probably more than those who have, because I do not take it for granted at all. It seems like it's alive and well in America, given that half of the movies nominated for Oscars were based on real people.

So I would say that any AI regulation that is respectful of the existing First Amendment is not reducing it; it's keeping it as it is, alive and well. So I do think that we need to stay within the limits of the First Amendment and not go beyond them. As to national regulation: we work with global platforms — not even national ones, global platforms. Doing anything state by state is a very cumbersome process for preventing content from getting on a platform unauthorized. If we have to fight that state by state, it's untenable. It just doesn't work.

Sen. Amy Klobuchar (D-MN):

Very good. Mr. Davies, that will be my last one, and then we'll go ahead.

Graham Davies:

Thank you. I just want to reinforce what Robert just said — absolutely right. Music streaming is global. The success of this is having access to Twigs's music from the UK or from Tennessee or wherever, so it's high volume. Anything that adds complexity on a state-by-state level is anathema to this industry. So we are very strongly in favor of preemption.

Sen. Amy Klobuchar (D-MN):

Very good. Just the last thing, kind of along those lines — don't laugh, it'll be very fast, and you can put your answer in writing, Mr. Davies. In January, we heard testimony that generative AI has been used to create unauthorized digital replicas of news anchors making comments, and we have a number of things going on in the journalism area. I have a vested interest — my dad was a journalist for the Minneapolis Star Tribune — but also, Senator Kennedy and I have a bill to push for negotiation over content and to get journalism outlets reimbursed, mainly from Google and Facebook, for the use of this content, something that's going on in Australia and Canada. I will not go on, but what steps can streaming services take to ensure that unauthorized digital replicas of journalists are not posted on their platforms?

Graham Davies:

Senator, if I could follow up with you after. I am not briefed on that.

Sen. Amy Klobuchar (D-MN):

Okay. Excellent. Thank you.

Sen. Chris Coons (D-DE):

Thank you, Senator Klobuchar. Back to Senator Tillis for his second round. Twigs, would you like to weigh in first?

Tahliah Debrett Barnett (“FKA twigs”):

Oh, thank you. I'd actually like to go back to Mr. Sheffner's point about the desire to make very big and financially successful films about artists without consent. I think the problem is, if you're able to use an artist's voice and likeness without consent to tell their life story, you're giving the impression that it's, I guess, the equivalent of an autobiography rather than a biography. And that's the confusion. If you're able to use my voice and my exact face, you're saying this is what happened from my point of view, and it's not; it's what happened from a team of writers in Hollywood who want to over-dramatize things and maybe make it more tragic or more fantastical. And I think that's what makes me really nervous and uncomfortable and very vulnerable. I don't think it's fair that even after an artist is deceased, somebody would be able to make a film about their life using them. We can watch a film about a person from the past, and if it's an actor, we know to take it with a pinch of salt. If it is the person themself, then it just feels too unclear and not fair, and actually not in — what am I trying to say? — not in the best interest of the artist's legacy.

Sen. Chris Coons (D-DE):

Thank you. Thank you, Senator.

Sen. Thom Tillis (R-NC):

Thank you, Mr. Chair. I'm going to be brief. I do have a question for you, Mr. Crabtree-Ireland. In the current draft legislation, individuals only have a right to license out their digital likenesses if they hire an attorney or they're a member of a labor organization. We've gotten some feedback suggesting that this is a giveaway to your organization in particular, really vectoring everybody either into legal counsel or into your union. Can you give me examples of other areas in law where this is the case, where you have to engage an attorney or a labor interest to move forward?

Duncan Crabtree-Ireland:

Sure, and I guess I would just say I don't think it's just our union; it would be any collective bargaining representative. But there are a number of examples in labor and employment law where there are defined worker protections that can then be deviated from through a collective bargaining arrangement, but not through individual contracts. In this case, the proposal, I think, is a little broader, a little more open, because of the option of securing individual representation by an attorney as an alternative, which isn't normally present in those kinds of statutes. I'm sure I could provide examples, though I can't give you a laundry list right now.

Sen. Thom Tillis (R-NC):

If you could; we're going to be submitting questions for the record for all of you, which will provide an opportunity for additional information. Mr. Chair, I just think it's remarkable. If you take a look at the attendance in the audience and the engagement from the members, you're hard pressed to see that on technical subjects like this; to have members come twice, or stay a long time, demonstrates the interest. Twigs, I'm going to end my questions with you. I do believe that Congress needs to act, but you need to understand that it's tough to get virtually anything done, even what appears to be common sense, for the reasons that we've talked about. We're going to have constitutional questions we have to address, we have to get to a number of matters, and hopefully, we will get it done this year.

In your opening statement, you were emotional, or appeared to be emotional, at one or two points. And I think people need to understand; excuse me. One of the reasons maybe you got emotional is because this is an existential threat to creators, and I'm trying to figure out how we educate people on the difference between an original creation from a human being and something that was either created or augmented by a machine. And this is more of a societal thing that we have to sort out. At what point is a society just prepared to say, boy, this sounds just as good; I know it comes from a machine? You mentioned something about the investment that your fans have made in you. How do you invest in a relationship with a machine? I mean, we're at an interesting point in history where we could have billions of people think the inauthentic creation of a machine is somehow as good as the hard work of a human being. And I wonder about this philosophical question: when we lose all the creators, at what point can those machines never possibly match the creative genius of an individual?

Tahliah Debrett Barnett (“FKA twigs”):

Thank you.

Sen. Thom Tillis (R-NC):

That's okay. Red means on, which makes no sense to me. But

Tahliah Debrett Barnett (“FKA twigs”):

I think that there are two things here. I feel incredibly lucky to have spent the whole of my teenage years without a smartphone. So I straddle a generation where I memorized all my friends' numbers. I would walk to my friend's house. If we said we were going to meet at one o'clock, I just would have to be there; there was no texting to say that I was going to be late. I loved my brain back then. I loved how simple it was. I loved what the truth was back then. I loved that I was able to think for myself. Even where we're at with the internet now, it's so confusing. Even if you just want to find a simple news story, we can't; even if you want to find the truth about whether a food is even good for you or bad, we can't. It's just a stream of nonsense. I look at a lot of my friends who have children who are teenagers, and their mental health is really struggling. We're looking at young people who have anxiety, who have depression, because they're overwhelmed with information and a lack of truth and a lack of stability. And the thing that scares me is that my fans look to me for a north star, a message, a sense of being. My work is something that they can find themselves in. And if you change the narrative of my work, we're just messing with their brains.

The solid essence of my work that I've spent 10 years developing; if someone can just take it and make up something completely different, I'd feel so bad, because I'm harming people then, and there would be nothing that I could do about it. I think the way that we can prevent this from happening is by putting the power in the hands of the artists, and also in the hands of the people who are there to protect the artists, whether that's third parties like record labels or agents or lawyers. It's up to the artist to understand and to sign a contract if we want to. But I think that the way that I've been experimenting with deepfakes is going to help my fans. It's going to help them understand the nuance of my language across all parts of the world. The way that I want to use it is not harmful, because I think inherently, artists just want to express their emotions, be there for people, and say things that you can't say for yourself. So if you're putting words in our mouths, it's going to be devastating.

Sen. Thom Tillis (R-NC):

Well, I also agree. I'm very glad there weren't cell phones back when I was a young person, but maybe for other reasons, and that Polaroids faded. But no, I am glad that we're taking up this bill. I do feel strongly that we should do everything we can to try and move it in this Congress. If not, then we just have to lean into it and get it done in the near future. But when we have these discussions, it points to all the other societal challenges for creators that we need to get right. This technology; I love it. I interact with generative AI for about an hour every day as part of my own study of it, a study that, for me personally, began back in the 1980s with artificial intelligence. But we've got a lot of work to do, and Congress has a role to play; we've got to be very, very careful not to overstep, not to trample the rights of others, and we're going to need your help and your continued engagement to get it right. So thank you all for being here today.

Sen. Chris Coons (D-DE):

Senator Tillis, thank you. Thank you for again being a great partner. I have even more questions, but we have come to the end of our time, and you and Senator Blackburn have been terrific to work with. I am grateful to all of our witnesses for the way that you've brought your skills, your value, your background, your creativity, and your voice to this hearing today, and we've engaged in a lot of challenging questions about how we could refine this bill and how we could narrow it. A lot of members have participated. For those who did not participate, or those who still have other questions, the record will be open for questions for the record for the witnesses. They're due one week from today, by 5:00 PM on May 7th; although, Twigs, in your case, two weeks. Before we wrap this up with cellophane and move forward, if I could: today's hearing was important to show that when we regulate the use of AI, we have to balance individual privacy rights and First Amendment rights in a way that doesn't stifle creativity and innovation.

These rapidly developing AI tools reinforce what we've heard today about why we need a clear policy to protect the image, voice, and likeness of all individuals from unauthorized AI replicas. The feedback we heard, and that our staff has received over the last six months, is critical. I look forward to working with my colleagues and co-sponsors, the witnesses, and the others who attended today to refine this in the next week or two and get to the point where we can introduce it next month, so we move from discussion draft to reality. I think we need to seize the moment and move forward. Thank you for your partnership. Thank you for your testimony. With that, this hearing is adjourned.

Authors

Prithvi Iyer
Prithvi Iyer is a Program Manager at Tech Policy Press. He completed a master's in Global Affairs from the University of Notre Dame, where he also served as Assistant Director of the Peacetech and Polarization Lab. Prior to his graduate studies, he worked as a research assistant for the Observer Resea...