Beware! Deepfakes of famous personalities promoting betting apps, fraud investments
Expressing his anguish, Tendulkar said that the ‘rampant misuse of technology is disturbing.’ He asked social media platforms to be alert and responsive to complaints.
By Md Mahfooz Alam
Published on 15 Jan 2024 4:58 PM GMT

Hyderabad: The emergence of deepfake videos created using Artificial Intelligence (AI) voice cloning technology has become a menace to society.
While fighting off a barrage of photoshopped tweets and images has become the norm, many believed that video would remain the last bastion of truth. But the latest advances in media manipulation now make it possible to create fake videos that can convince even a trained eye.

Deepfakes, especially of famous people, are doing the rounds on the Internet, showing them doing or saying things that never happened but look real nevertheless.
How are deepfakes created?
First, a publicly available video of a prominent public figure, such as a sportsperson, actor, politician or industrialist, is chosen.

The video is then run through AI software that convincingly replaces only the person’s mouth movements and voice. The resulting video looks the same as the original, except that the person now appears to be saying whatever the editor wants them to say.
Currently, there is an influx of such videos on social media. Scammers are re-engineering videos of prominent personalities, making it look like they are promoting or endorsing shady betting apps and investment schemes.
What can the government do?
Sachin Tendulkar posted one such deepfake video, which shows him promoting a gaming app called Skyward Aviator Quest, available on the Apple App Store. Tendulkar, or more accurately a convincing clone of his voice, can be heard saying in Hindi that earning money has become easy these days.

In the video, his voice claims that his daughter earns Rs 1,80,000 every day playing a game called Skyward Aviator Quest.
After coming across the video, Sachin Tendulkar took to X, requesting people to report such videos, ads and applications promoting fraud.

Expressing his anguish, he said that the ‘rampant misuse of technology is disturbing.’ He asked social media platforms to be alert and responsive to complaints, adding that swift action on their part is crucial to stopping the wide circulation of misinformation and deepfakes.
These videos are fake. It is disturbing to see rampant misuse of technology. Request everyone to report videos, ads & apps like these in large numbers.

Social Media platforms need to be alert and responsive to complaints. Swift action from their end is crucial to stopping the… pic.twitter.com/4MwXthxSOM

— Sachin Tendulkar (@sachin_rt) January 15, 2024
Union Minister Rajeev Chandrasekhar thanked Tendulkar for posting the video and warned social media platforms about legal violations, asking them to curb and prevent AI-powered deepfakes and misinformation. He also said that the government will be codifying stricter rules under the IT Act to ensure compliance by platforms.
More deepfakes
Similarly, a video showing Finance Minister Nirmala Sitharaman announcing, in English, a strategic partnership with Quantum Trade, a crypto trading platform, is in circulation on social media. In the original video, however, she was speaking in Tamil and calling out the Tamil Nadu government for inefficiencies in its flood mitigation and relief efforts.
In another instance, industrialist Ratan Tata called out a deepfake of him ‘giving’ investment advice on Instagram. The fake video was shared by an Instagram user named Sona Agarwal, and its caption offered users a ‘chance’ to increase their investment ‘risk-free’. Tata’s voice, in the fake video, referred to Agarwal as his manager.
Another deepfake showed actor Shah Rukh Khan promoting a game called Aviator. Khan’s voice could be heard making dubious assertions in the ad, including claims like ‘everyone wins in this game’. The video had over 50,000 views, according to The Indian Express.
Another deepfake video showed cricketer Virat Kohli saying he won “a huge amount of cash” playing the game and that his “hands are still shaking” after the win. The Indian Express report also mentioned a clip of a newscaster from an Indian TV channel talking about how Kohli won Rs 8,00,000 on a betting game.
One of the most recent deepfakes is of journalist Amish Devgan ‘reporting’ on the benefits of a casino gaming app called 11Winner. The audio being out of sync with his lip movements is a giveaway that it is a deepfake. We also did not find any report about Devgan promoting the casino app on the News18 website or its YouTube channel.