Deepfakes, luxury bags and Ramayana: Unmasking the deepfake trend
The recent uproar over the spread of deepfake videos featuring celebrities has sparked widespread concern. Deepfakes are digitally manipulated videos that alter a person’s face or body to make them look like someone else, often used with harmful intent or to disseminate misinformation. It is a tool for deception. As the technology behind these videos advances, distinguishing between an authentic video and a fabricated one is becoming increasingly difficult. High-profile individuals from the former US president Barack Obama to Bollywood actress Kajol have been targeted by deepfakes.
However, the core problem isn’t the technology itself, but the portrayal of these deepfakes as real footage. Artificial Intelligence (AI) has brought the issue of deepfakes to the forefront, but deception has been around in various forms throughout history.
In the past few decades, the market for fake luxury products such as handbags has surged across the world. Over the years this market has become so diversified that well-defined quality tiers now exist among fake handbags, known as first copy, second copy, and so on.
The closer the copy is to the original product, the higher the price. If you go to a grey market in Delhi, or anywhere else in the world, you are likely to find fake apparel, shoes, software, electronic products, and more. Thankfully, at these places you at least know you are buying a fake product.
The stakes are higher in industries like pharmaceuticals, where counterfeit life-saving drugs pose serious risks. When counterfeit products endanger human health, they also erode the foundational trust that underpins human connections.
Similarly, counterfeit baby products can be extremely harmful. Even on social media, a lot of what is portrayed is nothing short of deception. The lives some people project can be far from reality, often masking underlying difficulties and mental health struggles caused by the pressure of maintaining an idealized appearance.
The innate human impulse to adopt disguises or employ deceit to disseminate false information or illicitly achieve a goal is deeply rooted and ancient. For instance, in the epic Ramayana, the demon king Ravana assumes the guise of a hermit to kidnap Sita, who was left vulnerable after sending Lakshman to aid Rama, who was pursuing Maricha—a demon “deepfaked” as a golden deer.
However, this concept has been utilized for benevolent purposes too. In the Mahabharata, Arjuna conceals his identity as Brihannala, a eunuch and dance instructor, during the Pandavas’ incognito year of exile. Likewise, in Hindu mythology, during the cosmic event of Samudra Manthan, Lord Vishnu adopts the enchanting form of Mohini to trick the demons and ensure that the deities receive amrit, the elixir of life.
Deception and disguise are such fundamental aspects of human behaviour that they frequently feature in the narratives of mainstream movies. In films like Tom Cruise’s “Mission: Impossible” and Hrithik Roshan’s “Dhoom 2,” characters employ sophisticated trickery, akin to “deepfakes,” to achieve their objectives. Often, priceless artworks and jewellery are swapped out for replicas.
The prevalence of forgeries is not a new phenomenon, yet technological progress has significantly complicated the task of detecting them. With the widespread availability of deepfake applications, virtually anyone can fabricate a video within days. Fraudsters exploit this technology to commit financial fraud by mimicking the voices of individuals’ close contacts and psychologically manipulating their victims.
Consequently, deepfakes are poised to exert a complex influence on various aspects of human life, including social, psychological, financial, and political spheres.
Regardless of a person’s intellect or proficiency with technology, everyone is susceptible to the deception of deepfakes. Efforts to devise technological barriers, such as advanced algorithms to detect these forgeries, are underway. However, the ultimate defence against deepfakes lies in individual discernment. Fraudsters exploit emotional vulnerabilities by presenting enticing, believable falsehoods, often leading individuals to lose their critical judgment.
Hence, maintaining emotional vigilance is crucial to not being misled by deepfakes. It’s essential to scrutinize one’s preconceptions.
In the realm of ancient Indian thought, the quest for truth, distinguishing the real from the unreal, has been a persistent theme. Shankaracharya, a 9th-century philosopher, illustrated this with the analogy of mistaking a rope for a snake, attributing such errors to ignorance, or Avidya. Recognizing authenticity requires overcoming ignorance. In the modern context, this implies educating oneself on how deepfakes work and what the telltale signs of a deepfake are.
Throughout history, societies have grappled with and adapted to various forms of deception and disguise. While there have been exceptions, the fundamental trust that underpins societal interactions has largely remained unshaken. Now, we are confronted with the contemporary dilemma of deepfake technology. It remains to be seen whether advancements in AI will continually deceive us or if human acumen will ultimately prevail.