In 2025, digital avatars are no longer a novelty; they are fast becoming central to commerce and entertainment. From digital replicas of real individuals to AI-generated personalities, companies are rapidly finding ways to deploy digital avatars to create a competitive edge. Globally renowned artists, such as KISS and ABBA, have leveraged digital replicas to enhance or lead entire performances. Pop labels such as SM Entertainment have launched virtual idol acts, and groups such as PLAVE consist entirely of synthetic performers. Meanwhile, online influencers are deploying digital replicas to extend livestreams while they are offline: human streamers take shifts with their digital counterparts, which continue selling while the humans sleep or rest. Executives from PLTFRM, a Shanghai-based marketing agency, shared that its virtual sales avatars, powered by AI-generated video and scripts, consistently outsell human sales representatives.[1]
These flourishing use cases for digital avatars are accompanied by growing concern from policymakers about the potential misuse of individuals’ identities and personas. Digital avatars sit at the nexus of several evolving legal regimes, including intellectual property rights, publicity rights, and consumer protection. Companies should therefore stay aware of the changing legal and regulatory landscape to understand how regulators will scrutinize their uses of digital avatars.
Recent Legal and Regulatory Developments
Several U.S. states have recently enacted laws that directly address the use of digital avatars:[2]
TN ELVIS Act
In 2024, Tennessee enacted the Ensuring Likeness, Voice, and Image Security (“ELVIS”) Act to broaden the right of publicity, which already protected individuals’ likenesses, to also cover the unauthorized commercial use of the voice of any individual, living or deceased.
Effective July 1, 2024, the ELVIS Act prohibits three types of conduct: (i) knowingly using an individual’s name, photograph, voice, or likeness without consent (directed at any person other than the individual) for the purposes of advertising or fundraising; (ii) knowingly publishing, performing, or otherwise making available to the public an individual’s voice or likeness without authorization; and (iii) distributing, transmitting, or otherwise making available any software or other technology where the primary purpose is to generate unauthorized replicas of an individual’s voice or likeness.
Notably, the ELVIS Act defines “voice” to include both recordings of an individual’s voice and digital recreations of it. This definition may encompass not only digitally generated voices trained on actual recordings but also analog impersonations by human performers.
- Key Takeaways:
- The ELVIS Act prohibits the non-consensual use or imitation of any individual’s voice or likeness in a commercial setting.
- It also creates civil liability for distributors of tools for generating digital replicas.
NY Digital Replica Law
New York recently expanded contractual protections for artists and performers in response to the rise of digital replicas. New York Senate Bill 7676B, effective January 1, 2025, regulates contracts for the creation and use of digital replicas of an individual’s voice or likeness. It voids and renders unenforceable any contract for a “new performance” by digital replication when the following three conditions are met:
(i) The digital replica of an individual’s voice or likeness would be used in place of work the individual would otherwise have performed in person;
(ii) The contract provisions do not include a “reasonably specific” description of the replica’s intended use (unless the uses are consistent with the terms of the contract and nature of the work); and
(iii) The individual was not represented by legal counsel or by a labor organization in negotiating the licensing of their digital replica.
- Key Takeaways:
- NY Senate Bill 7676B sets a higher standard for performer contracts to protect human performers from having their voice or likeness misappropriated or used to reduce job opportunities.
California Digital Creative Rights Laws
As we have written before, California has also expanded protections for artists and performers, recently enacting laws regulating creative rights in the context of digital avatars.[3] Enacted shortly after New York’s Senate Bill 7676B, California’s AB 2602, effective January 1, 2025, similarly renders unenforceable any contract provisions that allow for the creation and use of a digital replica of an individual’s voice or likeness, in place of work an individual would have otherwise performed in person, when:
(i) The provisions do not include a reasonably specific description of the intended uses of the digital replica; and
(ii) The individual was not professionally represented by legal counsel or a labor union in negotiating the contract.
Also effective January 1, 2025, AB 1836 provides that any person who produces or makes available a digital replica of a deceased personality’s voice or likeness in an expressive audiovisual work or sound recording, without consent, is civilly liable to any injured party.
- Key Takeaways:
- As in New York, California now provides new protections for performers whose voices and likenesses may be used in digital replicas.
Oregon Healthcare Title Law
In June 2025, Oregon took a significant step toward regulating digital avatars in the healthcare industry by enacting House Bill 2748, which prohibits nonhuman entities, including AI agents, from using certain titles associated with healthcare practitioners, such as “Registered Nurse” and “Licensed Clinical Nurse.” The law, which goes into effect January 1, 2026, is intended to give patients and other individuals clarity as to whether they are interacting with a human or a non-human healthcare provider.
- Key Takeaways:
- Oregon has taken a significant step toward regulating the use of digital avatars in the healthcare context.
- The law helps patients distinguish human healthcare providers from automated alternatives.
Arkansas Publicity Rights Law
Effective February 2025, Arkansas’s HB 1071 expanded state publicity rights law to address AI-generated digital replicas and simulations of an individual’s image or voice. The law amends the definition of “likeness” to expressly include AI-generated images, videos, and three-dimensional representations. Furthermore, while existing law already protected individuals’ voices, HB 1071 adds a new definition of “voice” that encompasses digital replicas, covering any sound “readily identifiable and attributable” to a particular individual, including both recordings and simulations of the individual’s voice. As with the ELVIS Act, this phrasing is broad enough that it could potentially reach analog simulations, such as impersonations.
Note that Arkansas law provides a safe harbor from liability for network or system service providers that unknowingly transmit infringing likenesses. HB 1071 expands that safe harbor to cover the unknowing transmission of voices as well.
- Key Takeaways:
- Arkansas expands state publicity rights law to address AI-generated likenesses and voices.
FTC Rule on Impersonation
In February 2024, the FTC finalized a rule addressing impersonation of any government or business entity, including its officers. This is not a new prohibition; such conduct was already unlawful. The new rule, however, allows the FTC to seek civil penalties and monetary relief in federal court, helping consumers recover losses from offending parties.
Effective April 1, 2024, the FTC rule deems it unfair or deceptive to falsely pose as a government or business entity (or its representative or employee) in a way that affects an individual’s commercial decisions. While the announcement of the rule focused primarily on fraudulent behavior, the rule could conceivably be applied to the unlawful use of the likeness of any individual with a qualifying affiliation to a government or business.
- Key Takeaways:
- The FTC now has broader authority to investigate the unlawful impersonation of government and business entities, and to impose monetary penalties.
Digital Avatar Litigation
Lehrman v. Lovo Inc. (Southern District of New York)
Two professional voice actors, Paul Lehrman and Linnea Sage, provided voiceover work for an anonymous client that they later discovered to be the AI voiceover startup Lovo Inc. (“Lovo”). They allege that Lovo unlawfully used their voices to create and sell AI-generated text-to-speech voice clones. The BBC reported that, in messages with the actors, Lovo had promised that the voices would be used for “academic research purposes only” and that the work would not be disclosed externally.
Both actors filed suit in federal court against Lovo, asserting federal copyright and trademark claims, as well as state law claims regarding consumer protection, publicity rights, and contract law. The court dismissed most of the plaintiffs’ trademark infringement and false advertising claims, holding that the plaintiffs’ voices were neither protectable trademarks nor used to endorse a particular product or service. It also dismissed (with leave to amend) certain copyright infringement claims, in part because the U.S. Copyright Act protects only original recordings, not simulations that merely imitate the qualities of a voice. The court also dismissed several state common law claims.
However, the court allowed the actors to proceed with certain claims related to state publicity rights and consumer protection law, as well as federal claims regarding infringement of Sage’s original voice recordings.
On the Horizon: Congressional Involvement
Federal legislation concerning digital likenesses is also under consideration. The Nurture Originals, Foster Art, and Keep Entertainment Safe (“NO FAKES”) Act was re-introduced in the Senate in April 2025 with significant revisions that incorporate stakeholder feedback on the prior draft. The bill would establish a first-of-its-kind digital replication right: a federal right of publicity that would allow rights holders to authorize the use of an individual’s voice or visual likeness in a digital replica or in connection with a product or service. The right would not be assignable during the individual’s life and would be licensable only by the rights holder. Upon the individual’s death, the right would be transferable or licensable by the individual’s heirs or licensees, and ownership could be transferred in whole or in part to others.
The NO FAKES Act would also impose liability on individuals and entities involved in creating or enabling the distribution of unauthorized digital replicas, subject to certain carveouts. For instance, online services that distribute unauthorized replicas would be exempt from liability if they did so unknowingly or if it were technically infeasible to disable access to the offending replica. The bill also contains a safe harbor for individuals or companies that merely distribute a product or service, unless it is designed primarily to generate unauthorized digital replicas. Online services that distribute unauthorized digital replicas likewise enjoy a safe harbor, so long as they have taken steps to comply with notice and takedown procedures.
In connection with the notice and takedown process, the revised bill introduces digital fingerprinting requirements, which use cryptographic hash functions (or the equivalent) to provide unique labels for digital content. To comply with a takedown request, an online service must remove or disable access to all other publicly available instances of the unauthorized replica that match its digital fingerprint, as soon as technically and practically feasible.
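The bill refers only to “cryptographic hash functions (or the equivalent)” without prescribing a particular scheme, so the sketch below is just one simple reading of the requirement: exact-match fingerprinting built on SHA-256. The function names and file paths are hypothetical, and this is an illustration of the concept rather than an implementation of any statutory requirement.

```python
import hashlib

def fingerprint(path: str) -> str:
    """Compute a SHA-256 digest over a media file's raw bytes.

    Identical files always produce identical digests, so the digest
    can serve as a unique label for a given piece of content.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash in chunks so large media files are not read into memory at once.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_takedown_notice(candidate_path: str, flagged_fingerprints: set[str]) -> bool:
    """Check whether a hosted file matches any fingerprint named in a takedown notice."""
    return fingerprint(candidate_path) in flagged_fingerprints
```

One limitation worth noting: an exact-match hash like this flags only byte-identical copies, so a re-encoded or edited version of the same replica would evade it. The bill’s “or the equivalent” language arguably leaves room for perceptual fingerprinting techniques that survive such transformations.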
The bill would also provide subpoena power for rights holders, allowing them to obtain identifying information of alleged infringers from online service providers through court-issued subpoenas.
Best Practices
- Obtain clear, informed, written consent from individuals before creating or using their likeness or voice.
- For contracts governed by New York or California law:
- Ensure that contracts involving digital replicas include reasonably specific descriptions of their intended use (especially in contexts involving work that would otherwise be performed in person); and
- Require that individuals are represented by legal counsel or a labor union in negotiations involving their digital replicas, especially for performers.
- Track state and federal laws to update compliance obligations.
- Monitor ongoing litigation to adjust risk management strategies.
- In sensitive industries (e.g., healthcare, finance), prevent avatars from using regulated professional titles.
- Establish internal review processes before publishing or distributing digital avatars.
[1] Alexandre Ouairy, the cofounder of PLTFRM, stated that its virtual sales avatars, which rely on AI video models and LLMs to generate visuals and scripts, consistently outsell human sales representatives; livestream sales are up 30% since the agency switched to AI avatars. https://www.wired.com/story/artificial-intelligence-tiktok-shop-ecommerce-china/
[2] Regulators outside the United States have focused on the risks of digital avatars as well. In the E.U., for example, the Artificial Intelligence Act requires clear labeling of “deepfake” content, as discussed in our Client Alert on the EU AI Act. https://www.mofo.com/resources/insights/240314-eu-ai-act-landmark-law-on-artificial-intelligence
[3] See our Dec. 2024 Client Alert on California’s Landmark Deepfake Legislation. https://www.mofo.com/resources/insights/241211-2024-year-in-review-navigating-california-s-landmark-deepfake-legislation