Even as the crypto sector speeds up its efforts to add layers of enhanced protection to various platforms, fraudsters and hackers are discovering novel ways to circumvent security measures. Hackers and con artists are now using AI deepfakes to penetrate the security of cryptocurrency exchanges and Web3 businesses. Jimmy Su, the chief security officer at Binance, said in a recent interview that malicious actors use deepfake AI to bypass the platforms' identity verification standards.
Deepfakes are synthetically produced images or videos intended to closely mimic the voice, features, and expressions of a person, living or dead. They are created using artificial intelligence (AI) and machine learning (ML) techniques that make the results look strikingly realistic.
Scammers' chances of evading crypto platforms' security and stealing user funds grow if they succeed in producing deepfakes of crypto investors. "Wherever possible, the hacker will search online for an ordinary picture of the target. Based on that, and using deepfake techniques, they can produce videos to do the bypass. For instance, some verification steps may require the user to blink their left eye, or to look to the left or right, up or down. Today's deepfakes are sophisticated enough to actually carry out those commands," Su told Cointelegraph.
Players in the cryptocurrency industry have been emphasising for a while now the rising danger that AI-generated deepfakes pose to unaware and unprepared victims. A deepfake video of Binance CEO Changpeng Zhao appeared on social media in February 2023; in it, a synthetic Zhao can be heard urging people to trade cryptocurrency only with them.
Earlier this month, a similar deepfake video of Elon Musk giving false advice on cryptocurrency investments was spotted on social media.
Many individuals may fail to spot the red flags that give these videos away as deepfakes because of how convincing they are. Su anticipates that, in the coming years, AI-generated deepfakes will become sophisticated enough to smooth out the inconsistencies that currently betray them.
"There are some elements of those videos that the human eye can pick out when we watch them, such as when the user must tilt their head to the side. Over time, AI will triumph over them, so we cannot always rely on it. Even while we have some control over our own recordings, there are certain videos out there that we do not own. So, once again, user education is one thing," Su added in the interview.
According to a recent analysis from blockchain security firm CertiK, a staggering $103 million (roughly Rs. 840 crore) was stolen through cryptocurrency exploits in April this year. Exit scams and flash loan attacks have been the main channels through which funds are siphoned off in crypto crimes. CertiK estimates that hackers and crypto fraudsters stole $429.7 million (roughly Rs. 3,510 crore) in the first four months of 2023.