
Revisions to proposed federal legislation fail to protect the public and to ensure individual control over digital replicas.
In May, a subcommittee of the U.S. Senate Judiciary Committee held a hearing to consider the dangers and opportunities presented by generative artificial intelligence (AI) technology that can deceptively replicate a person’s voice, likeness, and performances. In addition to inquiring generally into “deepfakes,” one application of that technology, the subcommittee considered a revised version of the No FAKES Act introduced in April in both the U.S. House of Representatives and the U.S. Senate. A prior version of the bill was introduced in July of 2024.
The bill has ballooned even further from last July’s version—now running 39 pages in length. Sponsors of the revised bill have expanded the originally introduced bill in their effort to satisfy the demands of Google, YouTube, and other big tech companies. The revised draft, however, largely leaves unaddressed the danger that the bill poses for individuals. As currently drafted, the bill does not adequately protect either the ordinary or the famous from losing control of their digital replicas. The bill’s expanded preemption provision also raises a host of uncertainties about how this bill, if passed, would interact with state laws that also cover digital replicas. Hopefully, as this bill continues to be considered in Congress and revised, these concerns will be addressed.
Proponents of the revised bill ostensibly seek to protect performers’ livelihoods and prevent the public from being deceived by unauthorized digital replicas by creating a federal right in such replicas. Unfortunately, as drafted, the revised bill would still work at odds with these objectives and could make things worse than the status quo by legitimizing deceptive uses of digital replicas rather than appropriately regulating them. The current iteration of the bill would protect record labels, large tech companies, the movie industry, and those who seek to profit from and control dead celebrities, but it gives insufficient protection to people at risk of losing control over their digital replicas and to the public, who may be deceived by deepfakes.
The proposed legislation does not include a federal right of publicity, which would provide a broader federal right to stop unauthorized uses of a person’s name, likeness, voice, or other indicia of identity. Nevertheless, it does extend overlapping rights that control uses of a person’s name, likeness, and voice in the context of digital replicas. As a result, the bill, if passed, would add yet another layer to what I have called the “identity thicket,” in which multiple laws and entitlements conflict over who has control over a person’s name, voice, and likeness, including over digital replicas. Notably, even though the bill nominally has a preemption provision, it does not generally preempt state right of publicity and privacy laws, which could conflict with a federal digital replica right.
The changes to the draft bill primarily focus on setting up a more elaborate immunity and notice-and-take-down process that will largely insulate the likes of Google and YouTube from liability, encourage them to take down material, and protect AI software designers. These issues are being well represented before Congress, as are many of the speech-related concerns with the bill. Accordingly, I will focus here on concerns with the bill that are not as likely to be highlighted by the big players involved in drafting and pushing this legislation.
Many of these concerns remain from prior versions of the bill. I hope that if this bill continues to move forward, these concerns will be addressed in future revisions.
As drafted, the bill puts us at risk of losing control over our voices, likenesses, and digital replicas. Although the bill thankfully prohibits the assignment of a person’s digital replica while the person is alive, it would still allow long-term and broad licensing of a person’s digital replica, which would undermine protections for those depicted and leave the public vulnerable to deception.
Although the bill restricts licensing of a person’s replica to 10-year terms, it allows for the continued use of replicas and derivatives created during the 10-year licensing period, which would make this durational limitation far less meaningful. There is also uncertainty about whether future digital replicas could potentially infringe copyrights in earlier replicas or displays of those replicas, further eroding the significance of the 10-year limit. Numerous questions remain about how to treat digital replicas within the copyright system, adding to the uncertainty about how this licensing regime would work. I have previously addressed some of these issues in a recent lecture.
Apart from potential clashes with and ambiguities related to copyright law, the licensing regime insufficiently protects individuals on its own terms. Although the bill thankfully requires that licenses identify the uses to be exercised with a “reasonably specific description,” it is not clear how this requirement would be interpreted by the courts, and future revisions should consider providing even greater clarity about what level of specificity is required. Consider, for example, whether a description of the use of a person’s digital replica “in audio-visual works” would count as “reasonably specific.” Similarly, would uses in promotions for a particular brand of soda or apparel qualify? If these two examples count as reasonably specific—which they should not—the limitation is essentially useless. A person’s replica could appear under such licenses in pornographic contexts and doing things that the person had no real awareness were authorized. Such an outcome would work against the bill’s stated objectives of protecting individuals from being exploited by AI technology and would worsen the deception of the public.
These concerns are exacerbated by the bill’s allowance of licenses agreed to not by the person whose digital replica is being licensed, but by an “authorized representative.” The bill allows an “authorized representative” to license a person’s digital replica without the person’s knowledge. This is a chilling possibility, especially given the reality that many people—particularly younger student-athletes, singers, models, and actors—have signed overreaching agreements related to the management of their name, image, voice, and likeness rights. Revisions to the bill should limit approvals to those given by the person whose digital replica is being licensed and require greater detail about what sorts of uses will be made under the license.
Possible improvements could include requiring that the specific description be particularized for each identified use of a person’s digital replica. The bill could also provide that, if such a description is not possible, the license must instead require additional review and consent by the individual.
The bill could also require that such licenses be entered into only with legal representation, which would provide even greater protection. Without that requirement, there is a risk that ordinary people will unwittingly agree, particularly online, to broad licenses for uses of their digital replicas or voice clones. Future revisions should protect against these dangers.
The bill does potentially allow for greater protection of union members if a collective bargaining agreement applies. This provision is primarily focused on members of the union SAG-AFTRA, which represents mostly actors, but it would not provide sufficient protection for nonunion performers, for union members stuck in agreements they signed before joining the union, or for those who work with nonsignatories. In addition, the bargaining landscape in the entertainment industry is likely to become much more challenging in the years to come, and the current contract is already due to be renegotiated next year.
The bill would provide better protection to minors by requiring court approval of licenses involving them. Given the broad provision allowing representatives to authorize digital replicas, however, the bill leaves open the possibility that minors will remain bound after they reach the age of 18 by broader contracts entered into by their parents or guardians when they were minors. Minors could also be harmed because the bill as drafted allows digital replicas created during the licensing period to be reused after that period ends.
As drafted, the No FAKES Act could even undo some of the protections provided in the Take It Down Act recently signed into law by President Donald J. Trump. The Take It Down Act importantly facilitates the removal from online platforms of unauthorized intimate visual depictions, including AI-generated ones. But the No FAKES Act would allow broad licenses and authorized representatives to permit the dissemination of such intimate images without the specific knowledge and approval of the underlying person—such uses would be deemed to have been done with authorization. Future amendments to the No FAKES Act should clarify that specific consent for the dissemination of digital replicas—particularly in the intimate image context—must stem from the individual and not from a representative or under a broad license that has not explicitly specified such uses.
The somewhat revised preemption provision in the bill would also create significant confusion. It may also be unconstitutional as drafted. The provision excludes from preemption any state statutes or common law right in existence as of January 2, 2025, “regarding a digital replica” or the production or offering of a service related to the production or dissemination of digital replicas. What is meant by the language “regarding a digital replica” is unclear, which creates a significant challenge. Lots of state laws cover digital replicas even if they do not use the magic phrase and even if they were passed long before we were talking about generative AI. For example, many state intimate image laws and longstanding publicity and privacy common law and statutes cover digital replicas but were not passed or recognized specifically to do so. Do these count as “regarding a digital replica” for purposes of the statute’s preemption provision? It is not clear. It is also not clear what happens if a longstanding statute is amended to address digital replicas more specifically but after the January 2 cutoff. Similarly, it is not clear what happens if a jurisdiction recognizes a common law right, such as a right of privacy or publicity, after January 2. In such instances, the common law analysis is often based on a conclusion that the right always existed but had not yet been recognized or, alternatively, that the claim is an appropriate evolution of preexisting common law. Would either, both, or neither of these common law determinations be preempted by the proposed federal law? Again, it is unclear.
The preemption provision would allow state laws related to “sexually explicit” digital replicas to remain in effect, but again questions loom. Do these laws have to specifically target digital replicas, or can they more broadly protect against the dissemination of sexually explicit images? The preemption provision also excludes state laws regulating election-related digital replicas. Another potential constitutional problem is the inequitable treatment across states. As drafted, the preemption provision gives more authority to states such as California and Tennessee that rushed into the fray to pass specific digital replica laws earlier than other states that have taken more time to address technological changes affecting their states.
In addition, the bill would create a federal postmortem digital replica right, one that would exist after the death of the person upon whose identity it is based. Many states already provide postmortem rights of publicity that would also encompass digital replicas, but this is an area where state laws vary. A federal law could help harmonize postmortem rights, but here the bill harmonizes only digital replica rights, thereby creating another layer of confusion in the identity thicket.
Additionally, the postmortem provision is added without consideration of its legitimacy or appropriate scope. The proposed federal postmortem digital replica right may create significant wealth tied to dead people. As currently drafted, the proposed postmortem right would incentivize and, in some instances, force the commercialization of the dead against their own wishes and their families’ desires. The estate tax system would treat these postmortem digital replica rights as part of the taxable estate of the deceased. This issue could be addressed by excluding such rights from the estate tax and by clarifying that the rights vest only after death.
Future revisions to the bill should also shift the postmortem provision’s focus from wealth generation to protecting against, rather than incentivizing, the commercialization of the dead. In a recent article, my co-author Anita Allen and I concluded that it is appropriate to extend some postmortem publicity rights, including over digital replicas, but such rights should focus on the preferences of deceased people in coordination with their relatives, rather than on creating an industry composed of the dead. The bill should be revised so that all dead people are given equitable treatment. The current revised draft of the bill would unjustifiably give more than 70 years of protection to those who commercialize the dead but a much shorter period of only 10 years to those who do not.
The postmortem provisions would also create a registration process at the U.S. Copyright Office. Given the current tumult at the Copyright Office and significant efforts to reduce the federal workforce, it is hard to imagine the Copyright Office taking on an entirely new burden of managing registrations for digital replicas. Reducing the duration of postmortem rights and having them equitably apply to everyone could eliminate the need for such a registration process while also providing a better balance between the rights of the living and the dead.
This bill is gaining supporters, particularly as it gets longer to address each powerful constituency’s issues. But as it picks up traction, the interests of the underlying individuals and the public should not be lost. Hopefully, future revisions will address these concerns while still adequately protecting record labels, the film industry, and tech companies.