Katrina Kaif (Photo courtesy: Tiger 3 trailer video screen grab)

After Rashmika Mandanna, Katrina Kaif becomes the latest victim of deepfake technology

@indiablooms | Nov 08, 2023, at 03:28 pm

After southern actress Rashmika Mandanna, Katrina Kaif has become the latest victim of deepfake technology.

In the original picture, Katrina, wearing a towel, could be seen fighting with a Hollywood stuntwoman.

The picture is part of a high-octane stunt sequence from the upcoming movie Tiger 3, which also features Salman Khan.

The edited version that has now gone viral portrays Kaif in a low-cut white top, Money Control reported.

Netizens have criticised the act and one of the users wrote on X: "Katrina Kaif's towel scene from Tiger 3 gets morphed. Deepfake picture is garnering attention and it's really shameful. AI is a great tool but using it to morph women is outright criminal offence. Feels disgusted."

Rashmika Mandanna-Zara Patel deepfake row:

The popular social media influencer whose body was used in the viral deepfake video that featured actress Rashmika Mandanna said she was 'deeply disturbed' by the incident.

In a statement posted on her Instagram account, Zara Patel said: "Hi all, it has come to my attention that someone created a deepfake video using my body and a popular Bollywood actress's face."

"I had no involvement with the deepfake video, and I'm deeply disturbed and upset by what is happening," she said.

"I worry about the future of women and girls who now have to fear even more about putting themselves on social media. Please take a step back and fact-check what you see on the internet. Not everything on the internet is real," the social media influencer warned.

Indian IT Minister Rajeev Chandrasekhar on Monday (November 6, 2023) reacted to a 'deepfake' video featuring actor Rashmika Mandanna and reminded social media platforms of their legal obligations to fight misinformation.

The Minister said that under the IT rules notified in April 2023, platforms have a legal obligation to ensure no misinformation is posted by any user and to remove it within 36 hours of it being reported by any user or the government.

What is a deepfake?

According to the Encyclopedia Britannica website, a deepfake is synthetic media, including images, videos, and audio, generated by artificial intelligence (AI) technology that portrays something that does not exist in reality or events that have never occurred.
