How Crypto Scammers Are Using AI Deepfakes to Phish Victims
Source: Gadgets 360
Crypto scammers and hackers are finding new ways to penetrate safety measures, even as the crypto industry accelerates its efforts to add layers of advanced security to various platforms. Hackers and scammers are now tapping into AI deepfakes in order to breach the security of crypto exchanges and Web3-related firms. Using deepfake AI, bad actors aim to bypass the identity verification criteria established by these platforms, Binance Chief Security Officer Jimmy Su said in a recent interview.
Deepfakes are artificially generated photos or videos that are designed to convincingly replicate the voice as well as the facial features and expressions of an individual — living or deceased. Artificial intelligence (AI) and machine learning (ML) tools are utilised to create deepfakes with realistic graphics.
If scammers succeed in creating deepfakes of crypto investors, it increases their chances of bypassing the security of crypto platforms and stealing user funds. “The hacker will look for a normal picture of the victim online somewhere. Based on that, using deep fake tools, they’re able to produce videos to do the bypass. Some of the verification requires the user, for example, to blink their left eye or look to the left or to the right, look up or look down. The deep fakes are advanced enough today that they can actually execute those commands,” Su told CoinTelegraph.
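To make the kind of check Su describes concrete, here is a minimal, illustrative sketch in Python of a randomised challenge-response liveness flow of the sort he mentions (blink, look left or right, look up or down). The challenge list, function names, and callbacks are assumptions for illustration, not Binance's actual implementation. Prompts are randomised so that a pre-recorded clip cannot anticipate them; Su's warning is that modern deepfake tools can now generate the requested motion on demand, which is what makes this style of check bypassable.

```python
import secrets

# Hypothetical challenge set; real KYC flows use similar prompts
# ("blink your left eye", "look left", "look up"), per the interview.
CHALLENGES = ["blink", "turn_left", "turn_right", "look_up", "look_down"]


def run_liveness_check(capture_response, verify_action, rounds=3):
    """Randomised challenge-response liveness check (illustrative sketch only).

    capture_response(prompt) -> frames    : shows the prompt and records video
    verify_action(frames, prompt) -> bool : face-analysis backend that decides
                                            whether the action was performed

    Randomising the prompts defeats a static, pre-recorded clip; the risk Su
    describes is that deepfake tools can synthesise the requested motion in
    real time, so this check alone is no longer sufficient.
    """
    for _ in range(rounds):
        prompt = secrets.choice(CHALLENGES)
        frames = capture_response(prompt)
        if not verify_action(frames, prompt):
            return False  # requested action not detected -> reject
    return True  # all challenges satisfied -> accept


if __name__ == "__main__":
    # Dummy backends so the sketch runs end to end; a real system would wire
    # these to a camera feed and a landmark/head-pose estimation model.
    passed = run_liveness_check(
        capture_response=lambda prompt: [f"frame-of-{prompt}"],
        verify_action=lambda frames, prompt: True,
    )
    print("liveness check passed:", passed)
```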
For some months now, players in the crypto sector have been highlighting the growing threat that AI-generated deepfakes pose to uninformed and unsuspecting victims. In February 2023, a deepfake video of Binance CEO Changpeng Zhao surfaced on social media. In that clip, an artificially generated Zhao can be heard urging people to trade crypto exclusively with them.
Deep fake AI poses a serious threat to humankind, and it’s no longer just a far-fetched idea. I recently came across a video featuring a deep fake of @cz_binance, and it’s scarily convincing. pic.twitter.com/BRCN7KaDgq
— DigitalMicropreneur.eth (@rbkasr) February 24, 2023
A similar deepfake video of Elon Musk sharing misleading crypto investment advice was also spotted on social media earlier this month.
Since these deepfake videos can look highly convincing, many people may not be able to spot the warning signs that give them away. Su predicts that, over time, AI will iron out the flaws that currently reveal deepfakes, and that their quality will only keep improving.
“When we look at those videos, there are certain parts of it we can detect with the human eye. For example, when the user is required to turn their head to the side. AI will overcome [them] over time. So, it’s not something that we can always rely on. Even if we can control our own videos, there are videos out there that are not owned by us. So, one thing, again, is user education,” Su said in the interview.
A recent report from blockchain research firm CertiK estimates that a whopping $103 million (roughly Rs. 840 crore) was stolen in crypto exploits in April this year. Exit scams and flash loan attacks emerged as the largest sources of stolen funds in crypto crimes. In the first four months of 2023, CertiK estimates, $429.7 million (roughly Rs. 3,510 crore) was stolen by crypto scammers and hackers.