According to Bitget Research, crypto losses from deepfake scams are predicted to surpass $25 billion in 2024. The alarming figure highlights the growing threat deepfake technology poses to cryptocurrency users.
Deepfake Crypto Scams Surge, Expected to Exceed $25 Billion in Losses in 2024, Reports Bitget
Since 2022, criminals employing deepfakes at an unprecedented rate have caused $79.1 billion in losses.
According to Bitget Research (via Cointelegraph), cryptocurrency losses due to deep fake tactics and scams are expected to exceed $25 billion in 2024, more than double the losses from the previous year.
In a report released on June 27, the crypto exchange said the number of deepfakes detected worldwide rose 245% in 2024, citing earlier research from Sumsub.
Bitget found that in the first quarter of 2024 the crypto industry saw a 217% increase in deepfakes compared to Q1 2023. China, Germany, Ukraine, the United States, Vietnam, and the United Kingdom recorded the highest numbers of detected deepfakes.
Bitget reported that deepfake-related crypto losses reached $6.3 billion in the first quarter and said it expects losses to grow to $10 billion per quarter by 2025.
“Deepfakes are moving into the crypto sector in force, and there is little we can do to stop them without proper education and awareness,” Bitget CEO Gracy Chen told Cointelegraph in a statement.
Intriguingly, the strategies of deepfake fraudsters have not evolved significantly over the years.
Ponzi schemes, phishing attacks, and fake projects that use deepfake technology to lure cryptocurrency investors are the most common sources of deepfake-related crypto losses. These schemes have accounted for over half of all such losses in the past two years.
“By impersonating influential figures, these schemes create the illusion of credibility and substantial project capitalization, thereby receiving large investments from victims without thorough due diligence,” said Bitget Research.
Michael Saylor, the executive chairman of MicroStrategy, has been a favored target for fraudsters. In January, Saylor disclosed that his team takes down approximately 80 artificial intelligence (AI)-generated fake videos of him daily, typically made to promote Bitcoin-related scams.
Bitget observed that deepfakes are also used for cyber extortion, identity and impersonation fraud, and market manipulation, for instance a fabricated statement from a news anchor or influencer that moves a token’s price. These uses, however, accounted for a far smaller share of losses than outright crypto scams.
Deepfake Crypto Offenses Could Hit 70% by 2026 Without Action, Warns Bitget Analyst
Bitget predicts that, if no effective countermeasures are implemented, deepfakes could be involved in as much as 70% of crypto crime by 2026.
“Criminals are increasingly employing fake photos, videos, and audio to exert a stronger influence over their victims,” Bitget Research chief analyst Ryan Lee told Cointelegraph.
“For instance, a video impersonating someone close to the victim could be pivotal for fraudsters, whereas a fake video of an influencer might bolster investor confidence in a scam project as an ancillary tool.”
Lee believes AI-powered voice impersonation is one of the most pressing concerns around deepfake technology. These tools let fraudsters call users while posing as their relatives and ask for money.
Another potential threat is deepfakes designed to evade Know Your Customer (KYC) checks and gain unauthorized access to a user’s funds.
“Right now, exchanges need to pay attention to their ‘Proof of Life’ features of the KYC systems the most,” said Lee.
“This feature essentially confirms that the user is a real person and not a static image or a video, through real-time actions like blinking, moving or secondary ‘Proof of Life’ requests.”
“We warn all our users upon registration that we use advanced AI solutions to quickly identify and prevent cases of deepfake usage,” he added.
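As a rough illustration of how such a challenge-based liveness (‘Proof of Life’) step can be structured, the sketch below issues a randomly chosen real-time action and accepts the session only if that action is observed within a short window. This is a minimal, hypothetical example, not Bitget’s actual system; the challenge list, time limit, and simulated detector output are assumptions made purely for illustration.

```python
import random
import secrets
import time
from dataclasses import dataclass

# Hypothetical set of real-time actions a liveness check might request.
CHALLENGES = ["blink twice", "turn head left", "turn head right", "smile"]


@dataclass
class LivenessChallenge:
    challenge_id: str
    action: str
    issued_at: float
    ttl_seconds: float = 10.0  # response window; assumed value

    def is_expired(self) -> bool:
        return time.time() - self.issued_at > self.ttl_seconds


def issue_challenge() -> LivenessChallenge:
    """Pick a random action the user must perform on camera."""
    return LivenessChallenge(
        challenge_id=secrets.token_hex(8),
        action=random.choice(CHALLENGES),
        issued_at=time.time(),
    )


def verify_liveness(challenge: LivenessChallenge, detected_action: str) -> bool:
    """Accept only if the requested action was observed before the window closed.

    In a real system `detected_action` would come from a video-analysis model;
    here it is a plain string so the flow can be demonstrated end to end.
    """
    if challenge.is_expired():
        return False
    return detected_action == challenge.action


if __name__ == "__main__":
    challenge = issue_challenge()
    print(f"Please {challenge.action} within {challenge.ttl_seconds:.0f} seconds.")
    # Simulated detector output standing in for a camera feed.
    simulated_detection = challenge.action
    print("Liveness confirmed:", verify_liveness(challenge, simulated_detection))
```

The unpredictability of the requested action is the point: a pre-recorded or static deepfake cannot know in advance which movement will be demanded, so it is far less likely to pass the real-time check.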
TokenPost | [email protected]
Copyright © TokenPost. All Rights Reserved.