DiMA Letter to House Judiciary Committee: Hearing on AI and Intellectual Property - 2024

February 1, 2024

The Honorable Darrell Issa
Chairman, Subcommittee on Courts, Intellectual Property, and the Internet
House Judiciary Committee

The Honorable Henry C. “Hank” Johnson
Ranking Member, Subcommittee on Courts, Intellectual Property, and the Internet
House Judiciary Committee

Dear Chairman Issa and Ranking Member Johnson,

The Digital Media Association (DiMA) appreciates the opportunity to share our perspective on issues surrounding personhood rights in the context of your upcoming hearing, “Artificial Intelligence and Intellectual Property: Part II – Identity in the Age of AI.” This is a critically important topic, and we appreciate your continued interest and engagement as policymakers, industry, and individuals all seek to navigate this rapidly changing landscape.

DiMA and our Members

DiMA represents the world’s leading audio streaming companies, whose investments in innovation drive the economic engine that has revitalized the music industry for the benefit of creators, rightsholders, consumers, and the economy. DiMA and its members – Amazon, Apple Music, Feed.fm, Pandora, Spotify, and YouTube – advocate for policies that ensure the continued success of the streaming economy, where music fans have legal access to music anytime, anywhere they want it, and artists and songwriters can connect with old fans and make new ones around the world.

AI and the Music Industry
Questions about the use and impact of AI technology, its applications, and how they intersect with existing law are an important area of focus for all music industry stakeholders, including DiMA and our member companies. AI has been used as a tool in the music industry for many years, and as the technology continues to rapidly evolve, it has the ability to assist creators and artists – including musicians, producers, and songwriters – and improve the way music is created, distributed, discovered, and consumed.

AI and Personhood
We appreciate this hearing’s focus on the important topic of “Identity in the Age of AI,” and look forward to working with the Committee and industry stakeholders as conversations continue around the issues raised by AI-generated replicas of individuals’ name, image, likeness, or voice. DiMA believes there should be appropriate safeguards to protect an individual’s personhood, and is committed to working toward solutions that ensure such protections in the age of AI. At the same time, we urge the Committee to proceed with caution so as not to inadvertently disrupt the AI technologies already being deployed successfully throughout the industry, or to upset the balance between personhood protections, creativity, and protected speech that any policy in this area must strike.

The Case for a Federal Solution
Fortunately, we do not start with a blank slate in considering how to protect individuals’ right to the use of their name, image, likeness, or voice in the AI age. There are numerous state laws relating to privacy and to the right of publicity already on the books, not to mention extensive common law. These existing laws can provide a helpful starting point in some cases, offering test cases of what works and what does not, as well as examples of how to balance the competing policy issues at play when protecting personhood rights.

The existing patchwork of state laws has also presented many challenges in the modern age of borderless, instant communication. Any attempt to update the protection of personhood today must solve this problem through a unified, federal framework that can be consistently applied across state lines, so that all parties have certainty of their rights and responsibilities and to enable efficient enforcement. DiMA strongly believes that an effective digital replica law must preempt related state laws and establish a level playing field suited for the digital commons.

No Secondary Liability
Another important consideration in establishing any new right is who should be liable for its infringement. DiMA’s position is that liability for unauthorized digital replicas should be direct and assigned to the creator of the violative content, not to downstream parties. This reflects the structure of the content ecosystem today, where content providers stand behind the legality of the content they offer and hold their distributors harmless for infringing content. In turn, distributors have processes whereby they help their partners mitigate risks and protect their customers from deceptive or infringing content.

The current laws assigning liability for infringing content to its owners are long-standing and have provided foundational protections that supported the effective development and operation of the streaming economy, and the broader content distribution market before it.

Further, we caution against approaches that presume that services hosting the material in question have any ability to determine whether it violates a (to-be-determined) digital replica right. Data challenges are prevalent in the music industry – an issue that long predates AI. Works are often distributed to services with significant metadata identifiers missing, or with data inaccuracies. There is no identifier that marks AI-created works as such, much less one that distinguishes whether a work is AI-assisted or AI-generated.

First Amendment Protections
Any legislation that impacts speech requires First Amendment safeguards. Numerous states have enacted “expressive works exemptions” to their right of publicity laws, recognizing that right of publicity statutes could impede or chill categories of speech. Such exemptions are critically important to preserving First Amendment protected speech, including depictions of individuals for a variety of purposes (e.g., docudramas, biographies, parodies, political cartoons). Any digital replica law must build these safeguards into its structure. DiMA suggests that the existing body of law provides important guidance on this point.

Concerns with Currently Proposed Legislation
Recently proposed legislation, such as the No AI FRAUD Act, fails to incorporate many of these important guardrails. Indeed, the draft expressly requires courts to balance any First Amendment interests “against the intellectual property interest in the voice or likeness,” a novel approach to the ambit of First Amendment protection. It confers immediate liability on services that distribute AI-generated material, with immediate damages and no opportunity to remedy the violation. And by positioning the newly created right as an intellectual property right, the bill departs from the concepts of personhood underlying the existing body of law and runs the risk that these rights would be assigned away from the very people they were designed to protect, becoming economic chips to be bargained for – a chilling prospect when it comes to personal identity. IP law by its nature creates a property right and carries provisions, such as alienability, that are ill-suited to governing control of something as uniquely personal and inherent to an individual as their voice. Moreover, IP law typically comes with certain terms – including a post-mortem right – that we do not believe are appropriate in this context. Coupled with overly broad definitions, this bill would create significant new liability and uncertainty.

We are concerned that the legislative proposals introduced to date would create uncertainty, increase barriers to entry for new competition, and have a chilling effect on the current operation of audio streaming and its future growth. We are thus heartened by the Committee’s interest in continuing to explore the right approach to personhood protection today.

Partner for Solutions
DiMA’s members invest significantly to ensure their platforms provide quality content that fans and consumers want to hear. DiMA and its members understand the importance of protecting an individual’s personhood, including voice, and look forward to working with industry partners to identify viable paths forward.

Our members care deeply about protecting against the unauthorized use of name, image, likeness, and voice, and stand ready to serve as a resource to the Committee as its consideration of these issues continues. We look forward to ongoing discussions with Committee members and appreciate your attention to this matter and to the views of all impacted stakeholders.

Sincerely,
/s/
Graham Davies
President and CEO, DiMA


Cc:
The Honorable Jim Jordan, Chairman, House Judiciary Committee
The Honorable Jerrold Nadler, Ranking Member, House Judiciary Committee