WATCH KLOBUCHAR’S FULL REMARKS AND QUESTION HERE

WASHINGTON - U.S. Senator Amy Klobuchar (D-MN), Ranking Member of the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, held a hearing titled “The Good, the Bad, and the Ugly: AI-Generated Deepfakes in 2025.” 

Testifying at the hearing were Country Music Singer-Songwriter, Martina McBride; CEO of the Recording Industry Association of America, Mitch Glazier; Senior Legal Counsel at the National Center on Sexual Exploitation (NCOSE), Christen Price; Director of Technology Policy at Consumer Reports, Justin Brookman; and Head of Music Policy at YouTube, Suzana Carlos.

“AI-enabled scams have become far too common. We know that it takes only a few seconds of audio to clone a voice. Criminals can pull the audio sample and personal back story from public sources,” said Klobuchar at the hearing. “We also need rules of the road to ensure that AI technologies empower artists and creators and not undermine them. Art doesn’t just entertain us. It’s something that uplifts us and brings us together.”

“That’s why this NO FAKES Act is so important. It protects people from having their voice and likeness replicated using AI without their permission, all within the framework of the Constitution, and it protects everybody, because everyone should have a right to privacy.” 

A rough transcript of Klobuchar’s opening remarks and questions is available below. Video is available HERE.

Senator Klobuchar: Thank you very much, Senator Blackburn, I'm very excited about this subcommittee and the work we've already done together for years on this issue and similar issues when it comes to tech.

I share your hopes for AI and see that we're on this cusp of amazing advancements if this is harnessed in the right way, but I'm also concerned if things go the wrong way. I think it was David Brooks, a columnist, that said he has trouble writing about it because he doesn't know if it will take us to Heaven or Hell. So it's our job to head to heaven, and it's our job to put some rules in place, and this is certainly one of them. 

We want this to work for children, for consumers, for artists, and not against them. And you brought up the example, Chair, of Randy Travis, who was at the event that we recently had with you and Senator Coons and myself about the bill, and how he used AI in such a positive way. But then we know there are these risks. 

And one of the things that I think is really exciting about this week is that, in fact, on Monday, the President signed my bill with Senator Cruz, the TAKE IT DOWN Act, into law. This was a bill I discussed with him and the First Lady at the inaugural lunch. 

It's an example of “use every moment you have” to advance a cause. And then she supported the bill and helped to get it passed in the House. Senator Cruz and I had already passed it in the Senate, and we were having some trouble getting it done over in the House. So we're really pleased, because it actually does set a track moving forward. Even though that bill is about nonconsensual porn, both AI-created and non-AI-created, it's had huge harmful effects: some 20 suicides a year of young kids who think they're innocently sending a picture to a girlfriend or a potential boyfriend, and then it gets sent out on their school internet. It gets sent out to people they know, and basically they believe their life is in ruins, don't have any other context, and take their own lives. And that's just the most obvious and frightful part of this, but there are others as well. So I'm hoping this is going to be a first step to some of the work that we can do, including with the bill that we're going to be discussing today. 

AI-enabled scams have become far too common. We know that it takes only a few seconds of audio to clone a voice. Criminals can pull the audio sample and personal back story from public sources. 

Just last week, the FBI was forced to put out an alert about scams using AI-cloned voices of FBI agents and officials asking people for sensitive payment information.

Jamie Lee Curtis was forced to make a public appeal to Mark Zuckerberg to take down an unauthorized, deepfake ad that included her digital replica endorsing a dental product. While Meta removed the ad after her direct outreach, most people don't have that kind of influence. 

We also need rules of the road to ensure that AI technologies empower artists and creators and not undermine them. Art doesn't just entertain us. It's something that uplifts us and brings us together. 

When I recently met with Cory Wong, a Grammy-nominated artist from Minnesota, he talked about how unauthorized digital replicas threaten artists’ livelihoods and undermine their ability to create art. 

So this is not just a personal issue. It's also an economic issue. Music and movies are among our country's best exports to the world. When you look at the numbers and how we've been able to captivate people around the world, that's going to go away if people can just copy everything that we do. 

And one of the keys to our success as a nation in innovation has been the fact, and Senator Coons does a lot of work in this area, [that] we've been able to respect copyrights and patents and people’s own right to their own products. 

So that's why this NO FAKES Act is so important. It protects people from having their voice and likeness replicated using AI without their permission, all within the framework of the Constitution, and it protects everybody, because everyone should have a right to privacy. 

I also am working in the space on AI to put some base rules in place in my role on the Commerce Committee. Senator Thune and I have a bill that we're reintroducing on this to set some rules for NIST to be able to put out there for companies that are using AI. And then I'm always concerned about its effect on democracy, but that is for a different day and in a different committee. 

But I do want to thank Senator Blackburn for her willingness to come out on doing something about tech, including the work she does with Senator Blumenthal, the work that we've done together on commerce. And if Monday is any sign with the first bill getting through and there in that Rose Garden signing ceremony, there's more to come, and so thank you and look forward to hearing from the witnesses.

Klobuchar: All right. Thank you very much. I guess I'll start with Mr. Brookman, the non-Grammy winner. I want to talk to you just a little bit about this consumer angle here, which I think is interesting to people. I think at its core, all of us involved in this legislation have made it really clear that it's not just people who are well known who will be hurt by this eventually, and that getting this bill passed as soon as possible is just as important for everyone. But I do so appreciate Ms. McBride being willing to come forward, because those stories, and the stories that we've heard from, like I mentioned, Jamie Lee Curtis, or from many other celebrities, are very important to getting this done. You just did a report on AI-generated voice cloning scams, finding that AI voice cloning applications, in the words of the report, present a clear opportunity for scammers, and we need to make sure our consumer protection enforcers are prepared to respond to the growing threat of these scams. I had this happen to my state director's husband; their kid is in the Marines, and they got a call. They figured out that it wasn't really him asking for things and money; they knew he couldn't call from where he was deployed. This is just going to be happening all over the place, and the next call will be to a grandma who thinks it's real, and she sends in her life savings. So I have called on the FTC and the FCC to step up their efforts to prevent these voice cloning scams. What are some of the tools that agencies need to crack down on these scams, even outside of this bill?

Justin Brookman: Yeah, absolutely. I think the first thing the Federal Trade Commission needs is more resources. They only have about 1,200 people right now for the entire economy, and that's down by about 100 just in the past couple of months.

Klobuchar: Down, way down, from even during, like, the Nixon era.

Brookman: Yeah, it used to be like 1,700, and the economy has grown like three or four times. Chairman Ferguson has said more cuts are coming, which I think is the wrong direction. I worked for the Federal Trade Commission for a couple of years. We could not do a fraction of all the things that we wanted to do to protect consumers, so more people, more capacity, more technologists. There's just not enough technology capacity in government. I was in the Office of Technology Research and Investigation there; that was like five people. That's just not enough, obviously, with all of these very sophisticated, I mean, deepfakes alone, let alone the rest of the tech economy. Then there's the ability to get penalties and even injunctive relief: if someone gets caught stealing something, the FTC often doesn't have the ability to make them give the money back. I know this committee has tried to restore that authority, and that would be important. And maybe the FTC could have rulemaking authority. But I would also like to see Congress consider legislative authority to address tools: if you are offering a tool that can be used only for harm, voice impersonation, deepfake pornographic images, maybe there should be responsibilities to make sure it's not being used for harm.

Klobuchar: Okay, thank you. Ms. Carlos, can you talk about what YouTube is doing to ensure it's not facilitating these scams?

Suzana Carlos: Sure, and thank you for the question, Senator.

Klobuchar: And thanks for your support for the bill.

Carlos: Of course. So, to start, we obviously see great and tremendous opportunity coming from AI, but we also acknowledge that there are risks, and it is our utmost responsibility to ensure that it is deployed responsibly. So we've taken a number of steps to protect against harmful content on our platform. First, we updated our privacy policies last year to ensure that all individuals can now submit a notice to YouTube when their unauthorized voice or likeness has been used on our platform, and once we've reviewed it and confirmed that that content should be removed, we will take it down. We've additionally implemented watermarks on our AI products. We originally began with image and audio watermarks using our SynthID technology, and we've recently expanded it to also apply to text generated from our Gemini app and web experience, and most recently as part of our Veo video tool. We've also taken the additional step of becoming a member of the C2PA, the Coalition for Content Provenance and Authenticity, where we're serving as a steering member to work with the organization to create indicators and markings that will allow content provenance created off platform to additionally be recognized, and we're deploying those technologies across our platform.

Klobuchar: Okay, thank you. We mentioned the TAKE IT DOWN Act, and thank you for the support for that. Mr. Glazier, you talked about how this is the first federal law related to generative AI, and that it's a good first step. Could you talk about what's going to happen here if we don't move on from there and we just stop and don't do anything for years, which seems to be what's been going on, and why it's so important to do this?

Mitch Glazier: I think there's a very small window, and an unusual window, for Congress to get ahead of what is happening before it becomes irreparable. The TAKE IT DOWN Act was an incredible model. It was done for criminal activity, you know, …

Klobuchar: Yeah, I know. 

Glazier: Yeah, right. You know, you wrote it, and it was a great model, but it only goes so far. We need to use that model now, and we need to expand it carefully, in a balanced way, to lots of other situations, which is exactly what the NO FAKES Act does. And I think we have a very limited amount of time to allow people and platforms to act before this gets to a point where it's so far out of the barn that instead of encouraging responsible AI development, we allow investment and capital to go into AI development that hurts…

Klobuchar: Stealing things…

Glazier: So let's encourage investment the right way to boost great AI development and be first. Let's not be the folks that encourage investment in AI technologies that really harm us.

Klobuchar: And Ms. Price, you've expressed concerns about this 10-year moratorium on state rules. I'm very concerned, having spent years trying to pass some of these things, and I think that one of the ways we pass things quickly, like Mr. Glazier was talking about, is if people actually see a reason to act: they don't want a patchwork, they want to get it done. But if you just put a moratorium in place, and you look at things like the ELVIS Act coming out of Tennessee, Ms. McBride, that would stop all of that. For my last question here before we go to another round, could you talk about why you're concerned about what is right in front of us now, which is this 10-year moratorium?

Christen Price: Yes, thank you for the question, Senator. We're concerned about the moratorium because it's basically signaling to the AI companies that they can do whatever they want in the meantime, and it inhibits states' ability to adapt their laws to this form of technology that's changing very quickly and has this potential to cause great harm. 

Klobuchar: Thank you.

###