Recently, numerous celebrities and influencers joined Meta’s AI chatbot program in exchange for commercial benefits. The beta version of the program was launched in the United States on September 28, 2023. Meta further announced plans to release voice versions in January 2024 and has already acquired the rights to use these personalities’ likeness, image, and voice to create AI-powered chatbots with profiles on Instagram, Messenger, and WhatsApp, each with its own unique personality and character description that, curiously, seems dissimilar to what the personality is known for. These chatbots are the latest generation of the generative AI tools whose use participants in the writers’ and actors’ strikes strongly condemn: creating scripts and replicating voice and acting performances without the authorization, or even involvement, of the artist in question.
This brings us to the dilemma we face today: regardless of the jurisdiction and the level of success a celebrity has achieved, it is extremely difficult to control the negative use of their personality and likeness amid rapid technological developments and the rise of deepfakes and morphed content.
Though personality rights have no statutory recognition in India, they are considered copyright-like rights, as these celebrities often work in the capacity of a “performer”. The Indian judiciary has nonetheless laid down progressive judgements in various matters, holding that famous personalities have the right to control and benefit from the commercial exploitation of their goodwill and associated traits. The Courts have further stressed the importance of maintaining a sense of privacy in their personal lives. It must be noted that such rights are enforced not only when one’s image and name are stolen and plastered across the infringer’s brand promotions, or when private information about a personality is used, but also when a personality’s name, image, and voice are used to carry out illegal activities, as in the ‘Kaun Banega Crorepati’ lottery scam.
Unfortunately, such rights are limited in India, as they vanish once the personality ceases to exist. The Madras High Court clarified that late CM Jayalalithaa’s niece Deepa could not restrain others from using the CM’s personality rights after her death, and that such rights could not be inherited. The Court reasoned that personality rights are inseparable from the intrinsic and inherent personality and signature style of the deceased celebrity and do not transfer to their relatives, regardless of blood relation or degree of ‘closeness’. This notion was also observed by Justice Sanjeev Narula of the Delhi High Court when he rejected the application for a temporary injunction filed by late Sushant Singh Rajput’s father.
Despite Sushant’s father being a Class II heir, his plea was rejected: Justice Narula reiterated that posthumous personality rights, as recognized in other jurisdictions such as Canada (for at least 14 years after death) and the US (for 70 years after death), would not be granted, while also holding that posthumous privacy rights were prima facie unenforceable in Indian courts.
The Indian judiciary has, however, been progressive in the face of technological advancements such as Meta’s AI chatbots. The Delhi High Court took a strict approach to prevent anyone from deriving commercial benefit by misusing actor Anil Kapoor’s personality and other aspects of his ‘persona’ to create forged autographs/photographs, false endorsements, etc., using technological tools such as AI, deepfakes, and face morphing.
The general presumption is that a finding of fair use is more likely when the infringer has not derived commercial benefit from the use. Though this principle belongs mostly to copyright law, a parallel can be drawn to the Court’s vehement objection, in Anil Kapoor’s matter, to the collection of fees for services offered on the defendant’s unauthorized platform, which organized a false endorsement of the actor’s appearance at a fictional event. Such deceptive practices by the defendants not only infringe the personality rights of a celebrity but also trick and mislead consumers, violating the ‘Guidelines on Prevention and Regulation of Dark Patterns, 2023’ published by the Ministry of Consumer Affairs.
Today, these AI tools can also morph a celebrity’s image in an ‘adult’ or ‘obscene’ manner, amounting to an infringement of their publicity rights and right to privacy. This notion was also recognized in the recent Anil Kapoor matter, which condemned the use of ‘dangerous’ and ‘freely available’ AI tools.
The Way Forward – Feasibility of Suitable Licensing Models
In March 2023, legendary actor Michael Douglas said he was considering licensing his name and likeness so that the rights would go to his family rather than to the Metaverse. Canadian musician Claire Boucher, also known as ‘Grimes’, permitted others to use the generative AI software ‘Elf.Tech’ to convert their voice samples to resemble hers and incorporate the results into their own original songs. However, this came with three conditions: her approval before release, due credit, and a share of the royalties.
Unfortunately, in India the negotiating power is tilted against performers: standard contract terms require them to assign all rights in their performances to the producer or record label, in perpetuity and across all media and formats. Technically, nothing then prohibits the producer from generating new material without the performer’s permission or any revenue share, raising ethical and moral-rights concerns.
Enforcement through traditional John Doe injunctions, such as the takedown of specific domain names in the Anil Kapoor order, may well become outdated in the near future, since infringers can create mirror websites in a flash by tweaking a single letter or digit. A Court judgement clarifying that dynamic injunctions apply not only to copyright infringement but also to infringement of copyright-allied rights, including personality rights, may therefore be helpful.