Blackburn, Welch Hold Roundtable with Over 20 Artists Who Support NO FAKES Act and TRAIN Act to Protect Creators from AI Harms
April 22, 2026
WASHINGTON, D.C. – Today, U.S. Senators Marsha Blackburn (R-Tenn.) and Peter Welch (D-Vt.) held a roundtable with more than 20 artists who are in D.C. to advocate for the Senators’ bipartisan NO FAKES Act and TRAIN Act during the Recording Academy’s “GRAMMYs on the Hill Advocacy Day.” These bills would protect creators from harmful deepfakes and empower artists to access the courts to protect their copyrighted works when they are used to train generative AI models.
Click here to download photos from the roundtable.
“Our Constitution—specifically, Article I, Section 8, Clause 8—gives all creators in our country the guaranteed right to benefit from their works, but AI is increasingly challenging this right that our creative community relies on to make a living,” said Senator Blackburn. “The NO FAKES Act would address the harms deepfakes pose for creators, and the TRAIN Act would empower creators to protect their copyrighted works when they are used to train generative AI models. It was an honor to hear from singers, songwriters, and entertainers who recognize the value of this critical legislation and are pushing Congress to get these bills across the finish line.”
“The arts help us find a way to come together at a time where there’s just so much conflict and division. The voices, words, heart, and soul that live in the music artists create is astonishing—that’s deeply human, and can’t be replicated by AI,” said Senator Welch. “The TRAIN Act is an incredibly important, bipartisan bill to stand up to AI and give power back to creators. I'm glad to champion this bill alongside Senator Blackburn to protect and stand up for the contribution that each and every artist makes that is so essential to the well-being, emotional health, and soul of everyone in this country.”
NO FAKES ACT
With the rapid advance of generative AI, artists and creators have already begun to see their voices and likenesses used without their consent in videos and songs created as nearly indistinguishable replicas. The NO FAKES Act would address the use of non-consensual digital replicas in audiovisual works or sound recordings by:
Holding individuals or companies liable if they distribute an unauthorized digital replica of an individual’s voice or visual likeness;
Holding platforms liable for hosting an unauthorized digital replica if the platform has knowledge of the fact that the replica was not authorized by the individual depicted;
Excluding certain digital replicas from coverage based on recognized First Amendment protections; and
Preempting future state laws regulating digital replicas.
Click here to read the bill text.
TRAIN ACT
Currently, there is no reliable way for copyright owners to determine whether AI companies used their works without permission to train AI models. Copyright owners—particularly small creators—are struggling to navigate the novel legal issues posed by AI copying their work. Very few AI companies share how their models were trained, and nothing in current law requires them to disclose training materials to creators.
The TRAIN Act would promote transparency about when and how copyrighted works are used to train generative AI models by enabling copyright holders to obtain this information through an administrative subpoena. Modeled on the process used for matters of internet piracy, the bill would provide access to the courts for copyright holders with a good faith belief that their copyrighted material was used. Only the training material containing their copyrighted works would need to be made available.
The bill would also ensure that subpoenas are granted only upon a copyright owner’s sworn declaration that they have a good faith belief their work was used to train the model, and that their purpose is to protect their rights. Failure to comply with a subpoena creates a rebuttable presumption that the model developer made copies of the copyrighted work.
Click here to read the bill text.
RELATED
Blackburn, Coons, Salazar, Dean, Colleagues Introduce “NO FAKES Act” to Protect Individuals and Creators from Digital Replicas
Welch Leads Bipartisan Bill to Protect Musicians, Artists, and Creators from Unauthorized AI Training