Jul 15, 2024

US Senators propose COPIED Act to rein in AI's “theft” of creative content

New US Senate bill aims to further curb AI’s unchecked use of copyright material, with proposals for AI content watermarks and protections for creators. RIAA boss Mitch Glazier says that “leading tech companies refuse to share basic data” and the COPIED Act “would grant much needed visibility”

US senators have proposed yet more new legislation to regulate generative AI, with music industry trade organisations welcoming the move. The COPIED Act would “set new federal transparency guidelines for marking, authenticating and detecting AI-generated content” and “protect journalists, actors and artists against AI-driven theft”. 

“Artificial intelligence has given bad actors the ability to create deepfakes of every individual, including those in the creative community, to imitate their likeness without their consent and profit off of counterfeit content”, says Marsha Blackburn, one of the senators sponsoring the bill. “The COPIED Act takes an important step to better defend common targets like artists and performers against deepfakes and other inauthentic content”. 

Mitch Glazier, CEO of the Recording Industry Association Of America, voiced his support for the bill saying, “Leading tech companies refuse to share basic data about the creation and training of their models as they profit from copying and using unlicensed copyrighted material to generate synthetic recordings that unfairly compete with original works”. He added that the COPIED Act “would grant much needed visibility into AI development and pave the way for more ethical innovation and fair and transparent competition in the digital marketplace”. 

The proposed legislation would require the US National Institute Of Standards And Technology to develop new standards, and a watermarking system, to easily identify AI-generated or AI-manipulated content. 

Additionally, it would establish standards to allow creators and journalists to attach ‘provenance information’ to their content online, as well as prohibiting “the unauthorised use of content with provenance information to train AI models or generate AI content”. 

Enforcement of these new rules would fall to the Federal Trade Commission and state attorneys general, though individual creators and content owners could also sue AI companies which they believed had used their content without permission in violation of the new act.

The COPIED Act is not alone in proposing AI regulation relating to copyright and creators’ rights. Other recent proposals in the US Congress include the No AI FRAUD Act in the House Of Representatives and the NO FAKES Act in the Senate, both of which focus on protecting individuals’ voice and likeness - including those of creators and performers - from unauthorised use in AI content.

Meanwhile, the Generative AI Copyright Disclosure Act would force AI companies to declare what data has been used when training their AI models. This would make it easier for copyright owners to see if their content has been used to train an AI model and, if it has been used without permission, to sue for copyright infringement. 

There is still a debate over whether or not AI companies need permission from copyright owners to use existing content. Many in the tech sector argue that AI training constitutes fair use under American copyright law, meaning no permission is required. The copyright industries do not agree and have filed lawsuits, including the RIAA-led lawsuits against Suno and Udio. As a result, the fair use defence will ultimately be debated - and tested - in court.

If passed, the COPIED Act would provide additional protection for copyright owners. Even if AI training was sometimes deemed to be fair use, the restriction on using content with provenance information would mean AI companies would still be required to obtain permission from creators and copyright owners. 

This would mean that, by attaching ‘provenance information’ to their content, copyright owners could take practical steps to ensure their works could not be used without permission by AI companies relying on the fair use defence. This would be similar to the way rights owners in the European Union can opt out of the text and data mining exception that AI companies can otherwise arguably rely on.

Other music industry organisations backing the proposals include Nashville Songwriters Association International, Recording Academy, National Music Publishers’ Association, Artist Rights Alliance, The Society Of Composers & Lyricists, Songwriters Guild Of America, and Music Creators North America. 

Performer union SAG-AFTRA and the music industry-instigated Human Artistry Campaign are also backers, alongside various organisations representing news and media companies, including News/Media Alliance, National Newspaper Association, America’s Newspapers, Rebuild Local News and the National Association Of Broadcasters.
