A bipartisan group of senators has introduced the AI COPIED Act to strengthen the authentication of original content and the detection of AI-generated content.
The legislation aims to safeguard the intellectual property of journalists and artists from being exploited by AI models without their consent.
What does the AI COPIED Act propose?
The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (AI COPIED Act) directs the National Institute of Standards and Technology (NIST) to develop standards and guidelines for verifying content origins and identifying synthetic content through methods like watermarking. It also requires AI tools used to create journalistic or creative content to let creators attach content provenance information and to protect that information from removal.
Under the AI COPIED Act, content creators such as broadcasters, artists, and journalists can take legal action against entities that misuse their materials or tamper with authentication markers. Enforcement can also be carried out by state attorneys general and the Federal Trade Commission (FTC), ensuring robust protection against unauthorized use and manipulation of content provenance information.
The Content Origin Protection and Integrity from Edited and Deepfaked Media Act (AI COPIED Act) underscores the following issues and potential remedies, section by section:
AI COPIED Act’s definitions
- Artificial intelligence: Ensure the definition aligns with the latest advancements and includes machine learning and neural networks explicitly.
- Content provenance information: Clarify what constitutes “state-of-the-art” and specify the types of metadata included.
- Covered content: Expand the definition to include digital representations beyond the U.S. Code if necessary.
- Deepfake: Include examples of deepfake content to illustrate its application.
- Synthetic content: Specify examples and clarify the distinction between fully generated and significantly modified content.
- Watermarking: Define technical standards for embedding information and examples of difficult-to-remove techniques (see the sketch after this list).
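To make the idea of embedded content provenance information more concrete, here is a minimal Python sketch of one way a provenance manifest could be attached to a piece of content and later verified. The field names, the `attach_provenance` helper, and the hash-based binding are illustrative assumptions for this article, not the format the bill or NIST would actually standardize.

```python
# Hypothetical sketch: binding a simple provenance manifest to a piece of
# content with a cryptographic hash. Field names are illustrative only and
# do not reflect any standard mandated by the bill or NIST.
import hashlib
import json
from datetime import datetime, timezone


def attach_provenance(content: bytes, creator: str, tool: str) -> dict:
    """Return a provenance manifest whose digest ties it to the content."""
    manifest = {
        "creator": creator,                      # who produced the content
        "tool": tool,                            # software used to create or edit it
        "created_at": datetime.now(timezone.utc).isoformat(),
        "content_sha256": hashlib.sha256(content).hexdigest(),
    }
    # Hash the manifest itself so later tampering with the metadata is detectable.
    manifest["manifest_sha256"] = hashlib.sha256(
        json.dumps(manifest, sort_keys=True).encode()
    ).hexdigest()
    return manifest


def verify_provenance(content: bytes, manifest: dict) -> bool:
    """Check that the content still matches the digest recorded in the manifest."""
    return manifest["content_sha256"] == hashlib.sha256(content).hexdigest()


if __name__ == "__main__":
    article = b"Example article text"
    record = attach_provenance(article, creator="Jane Reporter", tool="ExampleCMS 1.0")
    print(json.dumps(record, indent=2))
    print("intact:", verify_provenance(article, record))           # True
    print("tampered:", verify_provenance(article + b"!", record))  # False
```

Real watermarking schemes embed such information invisibly in the media itself so it survives copying and editing; the sketch only illustrates the metadata-plus-hash idea behind provenance records.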
AI COPIED Act’s establishment of standards
- Section 4(a)(1):
  - Detail the process and criteria for developing guidelines, including stakeholder involvement (e.g., industry experts, consumer advocates).
  - Specify periodic reviews and updates to standards to keep pace with technological advancements.
- Section 4(a)(2):
  - Define metrics for evaluating detection tools, such as accuracy, false positives, and false negatives (see the sketch after this list).
  - Include provisions for independent audits of detection tools.
- Section 4(a)(3):
  - Detail the scope and objectives of grand challenges and prizes, and outline funding mechanisms.
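As a rough illustration of the evaluation metrics mentioned for Section 4(a)(2), the sketch below computes accuracy, false-positive rate, and false-negative rate for a synthetic-content detector from labelled examples. The `detection_metrics` function and the sample labels are hypothetical placeholders, not outputs of any real detection tool or NIST benchmark.

```python
# Hypothetical sketch: scoring a synthetic-content detector against labelled
# examples. The labels and predictions below are placeholder data, not results
# from any real tool.

def detection_metrics(labels: list[bool], predictions: list[bool]) -> dict:
    """labels[i] is True if item i is synthetic; predictions[i] is the detector's call."""
    tp = sum(l and p for l, p in zip(labels, predictions))          # synthetic, flagged
    tn = sum(not l and not p for l, p in zip(labels, predictions))  # genuine, passed
    fp = sum(not l and p for l, p in zip(labels, predictions))      # genuine, wrongly flagged
    fn = sum(l and not p for l, p in zip(labels, predictions))      # synthetic, missed
    total = len(labels)
    return {
        "accuracy": (tp + tn) / total,
        "false_positive_rate": fp / (fp + tn) if (fp + tn) else 0.0,
        "false_negative_rate": fn / (fn + tp) if (fn + tp) else 0.0,
    }


if __name__ == "__main__":
    labels      = [True, True, False, False, True, False]
    predictions = [True, False, False, True, True, False]
    print(detection_metrics(labels, predictions))
    # accuracy ≈ 0.67, false_positive_rate ≈ 0.33, false_negative_rate ≈ 0.33
```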
AI COPIED Act’s research and public education suggestions
- Section 5(a):
  - Include specific research areas such as the impact of synthetic content on society and the economy.
  - Propose collaboration with academic institutions and international organizations.
- Section 5(b):
  - Outline the goals and strategies for public education campaigns, including target audiences and methods (e.g., social media, workshops).
  - Propose partnerships with educational institutions and non-profits.
AI COPIED Act’s requirements and prohibited acts
- Section 6(a)(1):
  - Define “commercial purpose” more precisely to include specific activities and industries.
  - Detail security measures, including encryption standards and periodic security assessments.
- Section 6(a)(2):
  - Specify the types of “covered content” tools and their applications (e.g., image editing software, audio synthesis tools).
- Section 6(b):
  - Define the term “unfair or deceptive act or practice” and provide examples.
  - Detail the exception for security research, including requirements for documentation and reporting.
- Section 6(c):
  - Clarify “express, informed consent” and the process for obtaining and documenting it.
  - Include provisions for data protection and privacy in consent agreements.
AI COPIED Act’s enforcement
- Section 7(a):
  - Detail the Federal Trade Commission’s (FTC) process for investigating and enforcing violations.
  - Include provisions for coordination with other federal and state agencies.
- Section 7(b):
  - Define the process for state attorneys general to bring civil actions, including notification requirements and timelines.
  - Specify the types of relief and penalties that can be sought.
- Section 7(c):
  - Outline the process for private parties and government entities to bring civil actions, including evidence requirements and burden of proof.
  - Define the statute of limitations more clearly, including when the clock starts for discovering violations.
AI COPIED Act’s rule of construction and severability
- Section 8:
  - Clarify the relationship with other applicable laws, especially those related to copyright and intellectual property.
  - Include examples of potential conflicts and resolutions.
- Section 9:
  - Detail the process for determining the enforceability or validity of provisions.
  - Propose a mechanism for addressing invalid provisions without disrupting the overall act.
AI COPIED Act’s additional provisions
- Transparency and accountability:
  - Include provisions for transparency in AI development and deployment, such as public reporting of AI system capabilities and limitations.
  - Detail accountability measures for violations, including penalties and corrective actions.
- International collaboration:
  - Propose collaboration with international bodies to develop global standards and best practices.
  - Include provisions for recognizing and enforcing foreign judgments related to content provenance and synthetic content.
Industry groups support it
Several industry groups, including SAG-AFTRA and the Recording Industry Association of America, have praised the bill. They recognize the significant threat posed by AI-generated content to the economic and reputational well-being of performers and other content creators. The COPIED Act addresses these concerns by promoting a transparent and accountable supply chain for generative AI content, ensuring that creators retain control over the use of their likenesses and work.
Can the Content Origin Protection and Integrity from Edited and Deepfaked Media Act bring regulation?
The introduction of the COPIED Act is part of a broader legislative effort to regulate AI technology. Led by Senate Commerce Committee Chair Maria Cantwell and supported by other key senators, this bill represents a crucial step in developing a comprehensive framework to manage the challenges posed by AI.
By establishing standards and enforcing regulations, the AI COPIED Act aims to foster innovation while protecting the rights of content creators. But can AI really be regulated, or have we already gone too far? You be the judge…