A recent deepfake video featuring Indian actress Rashmika Mandanna has caused both controversy and concern in Bollywood. The video, which purportedly shows Mandanna entering an elevator, was discovered to be a deepfake — digitally altered footage manipulated to make it appear as though the actress were present in it.
Mandanna's deepfake video quickly garnered massive attention, with many on social media completely duped by its apparent authenticity.
What is particularly unsettling about this deepfake is how convincingly it deceived the average netizen. However, upon closer examination, subtle signs reveal the video's inauthenticity.
The original video was posted on Instagram on October 8 and featured a woman named Zara Patel. There is no concrete evidence linking Patel to the creation of the deepfake video.
Comparing the genuine video with the deepfake, one can observe that Patel's face is visible at the beginning of both videos. However, after approximately one second, the face morphs into that of the popular actress in Indian cinema.
In response, Mandanna posted a statement on X (formerly Twitter), describing the experience as "extremely scary" and highlighting the urgent need for the community to address the dangers of identity theft and the misuse of such technology.
The incident has prompted strong reactions from Bollywood, with Amitabh Bachchan himself publicly supporting legal action in response to Mandanna's deepfake.
In a separate but related development, Bollywood actor Anil Kapoor recently celebrated a significant legal victory that highlights the intersection of celebrity rights and the misuse of artificial intelligence (AI).
Kapoor emphasized the role AI technology played in these cases, indicating that in the future he could simply obtain a court order and injunction to have infringing content removed.
With landmark verdicts of this sort on record, it will be interesting to see whether Mandanna proceeds with legal action against the misuse of deepfake technology.