The draft bill, circulated with bipartisan Senate support, would codify right of publicity protections against artificial intelligence (AI) creations that replicate any individual, not just individuals with commercially valuable likenesses.

Generative AI seems like it can do almost anything. It can create a new Jimmy Stewart bedtime story. In a snake-meets-tail twist, it can even impersonate Scarlett Johansson promoting an AI-generation app. With its draft Nurture Originals, Foster Art, and Keep Entertainment Safe Act (NO FAKES Act), a group of senators is attempting to set guardrails that protect individuals from harm when their likenesses are replicated by AI.

The bill is aimed at digital replicas: AI- or computer-generated imagery, voices, or visual likenesses. It extends only to replicas used in sound recordings or audiovisual works. Liability arises from creating a digital replica without consent, or from publishing, distributing, transmitting, or otherwise making such a replica publicly available. However, for anyone other than the replica's creator or creators, liability requires knowledge that the replica was unauthorized.

The right is descendible, meaning it can be exercised by an affected individual's executor, heir, assign or devisee if that individual is dead. A person is protected from unauthorized digital replicas for their lifetime and for 70 years after their death.

The bill also sets strict requirements for licensing digital replica rights. Contracts of adhesion do not qualify, so adding a digital replica license to a standard click-through of terms and conditions will not suffice. For a license of such rights to be valid, the licensing individual must either be represented by counsel and have a written license or be covered by a collective bargaining agreement.

The bill contains standard exclusions from liability, including parody and use in a documentary, docudrama or biographical work. But those who display or otherwise make available a digital replica cannot avoid liability by arguing that they did not participate in its creation. As drafted, this language covers social media sites like YouTube, as well as music streamers like Spotify.

This bill is still a draft, though one with the force of bipartisan support behind it. The current legal landscape surrounding right of publicity and related protections has a void this bill could fill, but the NO FAKES Act is still quite narrow.

The draft bill's very existence suggests that existing law cannot adequately address generative AI's impact. Courts are still grappling with how state right of publicity laws apply to the very problem the NO FAKES Act attempts to solve.

Currently, individuals' right of publicity is protected by some (but not all) states' statutes and common law. There is currently no federal statutory right of publicity, and state-level protection varies greatly. Plus, these laws are almost entirely silent on digital replicas, or AI creations of individuals' voices or likenesses.

Only New York has enacted a law extending the right of publicity to what it calls digital replicas: a "newly created, original, computer-generated, electronic performance by an individual in a separate and newly created, original expressive sound recording or audiovisual work in which the individual did not actually perform." Senate Bill S5959D protects performers from sexually explicit "deep fake" material and from all exploitation of their name, image and voice for 40 years after their death (whether or not the material is sexually explicit). But the strong language of this law, signed in November 2020, is subject to a carveout that defangs it almost entirely: if the material contains a "conspicuous disclaimer" stating that the performance was not authorized by the individual, there can be no liability. Presumably, the thinking is that as long as the public is not "deceived" into believing the individual authorized the performance, there is no harm.

Existing law is always applied to new situations, so the fact that other states' statutes and common law decisions do not call out AI is not necessarily limiting. The Supreme Court's statement on the right of publicity in 1977 still rings true in the context of AI replicas: the state's interest in protecting this right "focus[es] on the right of the individual to reap the reward of his endeavors," much like intellectual property law generally. Zacchini v. Scripps-Howard Broadcasting Co., 433 U.S. 562, 573 (1977). However, the Court's reasoning also highlights a shortcoming of current right of publicity law: it focuses largely on the commercial value of the individual's likeness rather than the inherent harm of misappropriating a person's likeness in the first place. See, e.g., In re Clearview AI, Inc., Consumer Privacy Litigation, 585 F. Supp. 3d 1111, 1127-30 (N.D. Ill. 2022). Damages, and even the ability to adequately plead a violation of current right of publicity laws, are tied to the commercial value an individual has (and can prove).

The NO FAKES Act changes the economic calculus of the right of publicity, enshrining a federal right of publicity against digital replicas that applies to all individuals, not just famous ones who can command high fees for their likenesses. The draft bill sets a statutory amount of damages per distribution of an unauthorized digital replica of an individual. This amount is the floor of an individual's recovery, not the ceiling: violators must pay the greater of the $5,000 statutory damages amount or traditional monetary damages. These amounts do not include other forms of monetary relief, such as attorneys' fees and punitive damages, both of which the draft bill also contemplates.

The NO FAKES Act may enshrine federal protections for all individuals against unauthorized digital replicas. But the rush to protect against the harms of AI leaves a lot on the table. The bill focuses narrowly on digital replicas rather than harmonizing state right of publicity laws across all uses of a likeness, whether AI-generated replicas or old-fashioned sound-alike, look-alike, and name misappropriation cases. Because AI has so many applications, many as yet unknown, legislating its effects piecemeal could create conflicts with existing state and federal law that have not yet been fully examined.
