A bipartisan coalition of U.S. senators has put forward a proposal aimed at safeguarding artists, musicians, and journalists from unauthorized artificial intelligence (AI) replication of their work. The bill, known as the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), seeks to prevent AI systems from using creative material without permission from the original creators.
Mandates for AI Tool Developers
The COPIED Act would require developers of AI tools to give users the ability, within two years, to attach provenance information to their digital content. This metadata would identify the origin of works such as images and written pieces, and content carrying it could not be used to train AI models without the owner's consent.
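How such provenance metadata might function in practice can be sketched in a few lines of code. The example below is purely illustrative and assumes nothing about the technical standards the bill leaves to NIST: it records a content hash, a creator name, and an AI-training consent flag in a hypothetical JSON sidecar file, and shows how a training pipeline could consult that record before ingesting the asset. All function names and manifest fields are invented for illustration.

```python
# Illustrative sketch only: the COPIED Act does not prescribe a format.
# Provenance is modeled as a JSON "sidecar" manifest stored next to the asset.
import hashlib
import json
from pathlib import Path

def attach_provenance(asset_path: str, creator: str, allow_ai_training: bool) -> Path:
    """Write a hypothetical provenance manifest alongside a digital asset."""
    data = Path(asset_path).read_bytes()
    manifest = {
        "asset": Path(asset_path).name,
        "sha256": hashlib.sha256(data).hexdigest(),  # ties the record to the exact bytes
        "creator": creator,
        "allow_ai_training": allow_ai_training,      # consent flag for model training
    }
    out = Path(str(asset_path) + ".provenance.json")
    out.write_text(json.dumps(manifest, indent=2))
    return out

def may_train_on(asset_path: str) -> bool:
    """A training pipeline could consult the manifest before ingesting the asset."""
    manifest_path = Path(str(asset_path) + ".provenance.json")
    if not manifest_path.exists():
        return False  # no provenance record: treat consent as not given
    manifest = json.loads(manifest_path.read_text())
    data = Path(asset_path).read_bytes()
    if hashlib.sha256(data).hexdigest() != manifest["sha256"]:
        return False  # asset was altered after the record was attached
    return bool(manifest["allow_ai_training"])
```

In this sketch the hash ties the record to the exact bytes of the asset, so any later alteration invalidates it; real-world standards of the kind NIST would develop would more likely rely on cryptographic signing embedded in the file itself rather than a loose sidecar.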
Leading the charge on the bill are Senate Commerce Committee Chair Maria Cantwell (D-WA), Senate AI Working Group member Martin Heinrich (D-NM), and Commerce Committee member Marsha Blackburn (R-TN). They aim to give content creators legal recourse against platforms that use their work without authorization, allowing them to seek compensation and enforce their rights.
Standards and Guidelines
The bill tasks the National Institute of Standards and Technology (NIST) with creating standards for provenance and watermarking of digital content and for identifying synthetic media. These guidelines would help determine whether a piece of content was generated or altered by AI and trace where it came from.
Several artist organizations, including SAG-AFTRA, the National Music Publishers’ Association, The Seattle Times, the Songwriters Guild of America, and the Artist Rights Alliance, have endorsed the bill. Senator Cantwell highlighted that the initiative would increase transparency around AI-generated content and return control to the creators.
Wider Legislative Context
The COPIED Act’s introduction fits into a broader legislative push to govern AI technologies. Recently, Senator Ted Cruz proposed a bill to hold social media firms accountable for managing deepfake pornography. Concurrently, Senate Majority Leader Chuck Schumer suggested a strategy to deal with AI concerns, including funding innovation and tackling deepfake issues in elections.
The COPIED Act also prohibits the use of protected content for AI training without approval, giving creators the right to control how their work is used and to seek compensation. The act would be enforced by the U.S. Federal Trade Commission (FTC) and state attorneys general. Last year, the FTC asserted its authority to regulate AI, warning that generative AI could facilitate scams and emphasizing its consumer-protection mandate.
Corporate and Legal Reactions
Generative AI developers are facing ongoing legal challenges from media companies and entertainment industry figures over copyright infringement. Recently, Cloudflare introduced a feature to block AI bots from scraping content, addressing concerns from website owners. According to the company, AI bots accessed 39% of the top websites using Cloudflare, while only 2.98% of those sites had blocks in place.
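Cloudflare's feature is a managed rule configured in its dashboard rather than in site code, but the underlying idea of filtering requests by crawler user agent can be illustrated with a minimal sketch. The user-agent substrings below (such as "GPTBot" and "CCBot") are published by their operators; the server itself is a hypothetical stand-in, not Cloudflare's implementation.

```python
# Hypothetical illustration of user-agent-based AI crawler blocking;
# Cloudflare's actual feature is a managed network-level rule, not site code.
from http.server import BaseHTTPRequestHandler, HTTPServer

# User-agent substrings published by some AI crawler operators (non-exhaustive).
BLOCKED_AI_AGENTS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

class BlockingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if any(bot in ua for bot in BLOCKED_AI_AGENTS):
            # Refuse the request before serving any content.
            self.send_response(403)
            self.end_headers()
            self.wfile.write(b"AI crawlers are not permitted on this site.\n")
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Hello, human visitor.\n")

if __name__ == "__main__":
    # Demo: curl -A "GPTBot" http://localhost:8000/ returns 403.
    HTTPServer(("localhost", 8000), BlockingHandler).serve_forever()
```

Determined scrapers can spoof their user agent, which is why network-level services layer additional bot detection on top of simple matching like this.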