Recently, a surge of controversial AI-generated videos has flooded the internet. Examples include the widely circulated “Musk-Trump kissing” meme and the provocative image of Biden wearing a diaper. This trend has now reached our borders, with a new video going viral featuring Poonam Pandey kissing comedian Samay Raina on his YouTube show India’s Got Latent. The video gained so much attention that both celebrities were forced to post clarification videos, explaining it was fake and AI-generated. In this age of rapid technological advancement, celebrity images, voices, and personas face unprecedented challenges as AI blurs the lines between reality and fabrication. Once considered a celebrity’s greatest asset, their public image has become a commodity, increasingly exploited in an environment dominated by deepfakes and digital counterfeits. The explosive growth of artificial intelligence has given rise to a new wave of threats, jeopardizing not only public reputations but also the personal rights of these celebrities.
Artificial intelligence has revolutionized the creative world, especially through tools that assist artistic expression, media production, and effective communication. However, these developments have also produced worrying cases of misuse. Deepfake technology can now generate videos and audio of real individuals that are strikingly realistic yet entirely fabricated, enabling unauthorized representations of celebrities. From unauthorized voiceovers in promotional material to manipulated graphic content, such violations attest to the dire need for a strong legal framework. One especially concerning trend is AI-powered voice modulation. Current algorithms can mimic voices with remarkable accuracy, making it genuinely difficult to distinguish original recordings from fake ones. Many have encountered songs dubbed in Prime Minister Narendra Modi’s voice, or defamatory voiceovers of celebrities and political figures so convincing that anyone would believe them to be real. Such violations are not merely unethical; they can cause serious reputational harm and mental anguish.
The Legal Landscape in India
Personality rights, which grant individuals control over their name, image, and likeness, remain relatively underdeveloped in India. In contrast, countries like the United States explicitly recognize these rights. In India, the legal framework addressing personality rights is scattered across various enactments, making it far less clear-cut. The protection offered by the right to privacy under Article 21 is not comprehensive. Public figures often turn to trademark law to safeguard their names and expressions, or to tort law to pursue defamation or appropriation claims. In the face of growing digital manipulation, however, these protections remain insufficient. Deepfakes present new challenges to the contemporary legal framework. Though some redressal options exist under the Information Technology Act, 2000 and the Copyright Act, 1957, those statutes were not designed for the complexities of AI-generated content. The result is a reactive legal environment in which public figures are forced into lengthy and expensive litigation to reclaim their image rights. Without forward-looking legislation, they remain susceptible to continued abuse.
The Cost of Digital Exploitation
The consequences of AI misuse run deep. Misappropriation of a person’s image for fake endorsements or explicit content can lead to reputational damage, financial loss, emotional distress, and an erosion of public trust.
Finding Solutions in a Digital World
To address these concerns, India must adopt a comprehensive approach spanning legislation, technological safeguards, and social awareness. First, there is an urgent need for well-rounded legislation that recognizes personality rights. Such legislation should specifically prohibit the unauthorized use of names, photographs, voices, and other personal attributes, with heavier penalties for violations.
Another focus should be dedicated AI regulation addressing deepfakes and voice cloning. Mandatory watermarks on AI-generated content and disclosure requirements for manipulated media can help curb misuse of the technology. Blockchain technology could also be applied to authenticate digital content, giving celebrities tools to protect their intellectual property.
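The watermarking and blockchain proposals above ultimately rest on the same underlying idea: publishing a verifiable fingerprint of authentic content at the time of release, so that later copies can be checked against it. A minimal sketch of that idea in Python follows; the file contents, digests, and "registration" step here are purely illustrative assumptions, not a description of any real platform or statute.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # A SHA-256 digest acts as a tamper-evident fingerprint:
    # any alteration to the content yields a different digest.
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, registered: str) -> bool:
    # Compare a copy's digest with one registered at release time
    # (e.g. recorded on a public ledger such as a blockchain).
    return fingerprint(data) == registered

# Hypothetical example: a studio publishes the digest of an official video.
original = b"official promotional video bytes"
registered_digest = fingerprint(original)

print(verify(original, registered_digest))           # True: authentic copy
print(verify(b"deepfaked edit", registered_digest))  # False: manipulated copy
```

The sketch shows why such schemes detect tampering but cannot, by themselves, stop fabricated content from being created in the first place; that gap is what the disclosure and watermarking requirements discussed above aim to fill.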
Public awareness campaigns can strengthen audiences’ ability to distinguish authentic content from spurious content. Promoting media literacy will reduce the demand for fake celebrity material and, with it, the incentive to create it. Meanwhile, celebrities and their representatives must remain vigilant about platforms where their image is used without permission and respond swiftly when it is.
India can learn a great deal from international precedents. In the United States, the right of publicity gives individuals control over the commercial exploitation of their identity, providing clear-cut legal recourse. The European Union’s General Data Protection Regulation, meanwhile, offers significant protection for personal data, a model that could be adapted to cover rights in images and voice recordings. By adopting comparable frameworks, India can equip its celebrities with the tools they need to protect their identities in an increasingly digital world.
Conclusion
As artificial intelligence technologies advance, the wrongful exploitation of celebrity image rights is only expected to increase. Robust legislation, advanced technology, and growing public awareness can together help protect these rights without stifling artistic freedom. A celebrity’s image is their identity, their livelihood, and their legacy long after they leave this world. Legally and morally, it must be guarded against wrongful use. Protecting the core elements of one’s identity becomes all the more important at a time when technology continues to blur the line between the real and the fabricated.
Authors: Riya Gupta & Harsehaj Bir Singh