A bipartisan group of senators has introduced a new bill aimed at protecting artists, songwriters, and journalists from having their content used to train AI models or generate AI content without their consent. Called the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), the bill also aims to make it easier to identify AI-generated content and to combat the rise of harmful deepfakes.
The bill was authored by Senate Commerce Committee Chair Maria Cantwell (D-Wash.), Senate Artificial Intelligence Working Group member Martin Heinrich (D-N.M.), and Commerce Committee member Marsha Blackburn (R-Tenn.).
The bill would require companies that develop AI tools to let users attach content provenance information to their work within two years. Content provenance is machine-readable information that records the origin of a piece of digital content, such as a photo or news article. Under the bill, works that carry content provenance information could not be used to train AI models or to generate AI content without the owner's consent.
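To make the idea of "machine-readable" provenance concrete, here is a minimal, purely illustrative sketch of what such a record might look like. The bill does not prescribe a format, and the field names below are hypothetical (real-world provenance standards such as C2PA are far more detailed); the sketch only shows how an owner's terms could travel with a file in a form software can read.

```python
import json
from datetime import datetime, timezone

# Hypothetical, simplified provenance record for a single piece of content.
# Field names are invented for illustration and are not taken from the bill
# or from any specific standard.
provenance = {
    "title": "Downtown Seattle at dusk",
    "creator": "Jane Doe",
    "created_at": datetime.now(timezone.utc).isoformat(),
    "origin": "original photograph, not AI-generated",
    "ai_training_permitted": False,    # owner withholds consent for model training
    "ai_generation_permitted": False,  # owner withholds consent for generative use
}

# Serialized as JSON, the record could be embedded as metadata alongside the file
# so downstream tools can check the owner's terms before using the content.
print(json.dumps(provenance, indent=2))
```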
The bill aims to give content owners, including journalists, newspapers, artists, and songwriters, the ability to protect their work and set terms for its use, including compensation, as well as the right to sue platforms that use their content without permission or falsify its source.
The COPIED Act would also direct the National Institute of Standards and Technology (NIST) to develop guidelines and standards for content provenance, watermarking, and synthetic content detection. These standards would be used to determine whether content has been generated or modified by AI and to identify where AI-generated content originates.
“The bipartisan COPIED Act, which I introduced with Senators Blackburn and Heinrich, provides much-needed transparency regarding AI-generated content,” Senator Cantwell said in a press release. “The COPIED Act also gives creators, including local journalists, artists, and musicians, back control of their content through a provenance and watermarking process, which I believe is sorely needed.”
The bill is supported by several creator and news organizations, including SAG-AFTRA, the National Music Publishers' Association, The Seattle Times, the National Songwriters Guild, and the Artist Rights Alliance.
The introduction of the COPIED Act comes amid a flurry of AI-related bills as lawmakers seek to regulate the technology.
Last month, Senator Ted Cruz introduced a bill that would make social media companies such as X and Instagram responsible for monitoring and removing deepfake porn. The bill comes amid a surge in AI-generated pornographic images of celebrities, including Taylor Swift, circulating on social media.
In May, Senate Majority Leader Chuck Schumer released an AI “roadmap” that included increasing funding for AI innovation, addressing the use of deepfakes in elections, and using AI to strengthen national security.
Additionally, Axios reported earlier this year that state legislatures were introducing 50 AI-related bills every week. According to the report, a total of 407 such bills had been introduced across more than 40 states as of February, up from 67 a year earlier.
As AI tools have become widely used, President Joe Biden issued an executive order last October setting standards for AI safety and security. The order requires developers of AI systems to share safety test results and other key information with the government before releasing those systems to the public. Former President Donald Trump has vowed to revoke the order if re-elected.