Marsha Blackburn

04/22/2026 | Press release | Archived content

Blackburn, Welch Hold Roundtable with Over 20 Artists Who Support NO FAKES Act and TRAIN Act to Protect Creators from AI Harms

April 22, 2026

WASHINGTON, D.C. - Today, U.S. Senators Marsha Blackburn (R-Tenn.) and Peter Welch (D-Vt.) held a roundtable with more than 20 artists who are in D.C. to advocate for the Senators' bipartisan NO FAKES Act and TRAIN Act during the Recording Academy's "GRAMMYs on the Hill Advocacy Day." These bills would protect creators from harmful deepfakes and empower artists to access the courts to protect their copyrighted works when they are used to train generative AI models.

"Our Constitution (specifically, Article I, Section 8, Clause 8) gives all creators in our country the guaranteed right to benefit from their works, but AI is increasingly challenging this right that our creative community relies on to make a living," said Senator Blackburn. "The NO FAKES Act would address the harms deepfakes pose for creators, and the TRAIN Act would empower creators to protect their copyrighted works when they are used to train generative AI models. It was an honor to hear from singers, songwriters, and entertainers who recognize the value of this critical legislation and are pushing Congress to get these bills across the finish line."

"The arts help us find a way to come together at a time when there's just so much conflict and division. The voices, words, heart, and soul that live in the music artists create are astonishing; that's deeply human, and can't be replicated by AI," said Senator Welch. "The TRAIN Act is an incredibly important, bipartisan bill to stand up to AI and give power back to creators. I'm glad to champion this bill alongside Senator Blackburn to protect and stand up for the contribution that each and every artist makes that is so essential to the well-being, emotional health, and soul of everyone in this country."

NO FAKES ACT

With the rapid advance of generative AI, artists and creators have already begun to see their voices and likenesses used without their consent in videos and songs created as nearly indistinguishable replicas. The NO FAKES Act would address the use of non-consensual digital replications in audiovisual works or sound recordings by:

  • Holding individuals or companies liable if they distribute an unauthorized digital replica of an individual's voice or visual likeness;
  • Holding platforms liable for hosting an unauthorized digital replica if the platform has knowledge of the fact that the replica was not authorized by the individual depicted;
  • Excluding certain digital replicas from coverage based on recognized First Amendment protections; and
  • Preempting future state laws regulating digital replicas.

TRAIN ACT

Currently, there is no reliable way for copyright owners to determine whether AI companies used their works without permission to train AI models. Copyright owners, particularly small creators, are struggling to navigate novel legal issues posed by AI copying their work. Very few AI companies share how their models were trained, and nothing in current law requires them to disclose training materials to creators.

The TRAIN Act would promote transparency about when and how copyrighted works are used to train generative AI models by enabling copyright holders to obtain this information through an administrative subpoena. Modeled on the process used for matters of internet piracy, the bill would provide access to the courts for copyright holders with a good faith belief that their copyrighted material was used. Only the training material containing their copyrighted works would need to be made available.

The bill would also ensure that subpoenas are granted only upon a copyright owner's sworn declaration that they have a good faith belief their work was used to train the model, and that their purpose is to protect their rights. Failure to comply with a subpoena creates a rebuttable presumption that the model developer made copies of the copyrighted work.


Marsha Blackburn published this content on April 22, 2026, and is solely responsible for the information contained herein. Distributed via Public Technologies (PUBT), unedited and unaltered, on April 24, 2026 at 18:15 UTC. If you believe the information included in the content is inaccurate or outdated and requires editing or removal, please contact us at [email protected]