Cybercriminals are now harnessing generative AI to carry out state-of-the-art voice cloning scams.
A McAfee study of 7,000 people worldwide revealed that one in four respondents had either experienced this type of scam themselves or knew somebody else who had.
With just a small audio sample, scammers can clone a person’s voice and use it to leave voicemails and send voice notes over messaging apps.
McAfee’s survey found that 70 percent of people could not confidently differentiate between a real voice and a cloned one.
Amy Bunn, Chief Communications Officer at McAfee, said:
“Cybercriminals create the kind of messages you might expect. Ones full of urgency and distress.
“They will use the cloning tool to impersonate a victim’s friend or family member with a voice message that says they’ve been in a car accident, or maybe that they’ve been robbed or injured.
“Either way, the bogus message often says they need money right away.”
Alarmingly, ten percent of those surveyed had received a message from an AI voice clone, and 77 percent of these people lost money as a consequence.
Of those who lost money, 36 percent lost between $500 and $3,000, and seven percent lost between $5,000 and $15,000.
The study also showed how easy it is for cybercriminals to get their hands on real voice files to create clones.
With 53 percent of adults sharing their voice data at least once a week, and 49 percent doing so up to ten times a week, there are reams of potentially exploitable audio for scammers to draw on.
The criminals’ job is made even easier by how much voice data is now shared publicly online, via YouTube videos, social media reels, and podcasts.
Nor is it hard for cybercriminals to get hold of AI voice cloning tools.
McAfee Labs found more than a dozen freely available on the web, and they require only a basic level of experience to use.
One of these tools created a clone with an 85 percent match to the original voice, and with additional training of the data models, a 95 percent match was possible from only a small amount of original audio.
Accents from around the globe also proved easy to replicate, although people who speak with an unusual pace or style were harder to imitate, making them less likely targets for cloning.
How the Scams Unfold
McAfee’s survey uncovered the typical approach AI voice cloning scammers take, as well as the effect these scams have on victims.
It found that 45 percent of respondents would reply to a voicemail or voice message that appeared to come from a friend or loved one asking for money, especially a partner, mother, or child.
As Bunn explained above, cybercriminals usually pretend to be in distress, and the study’s results bear this out: messages claiming the sender had been in a car accident (48 percent), been robbed (47 percent), needed help abroad (41 percent), or lost a wallet (43 percent) had a high chance of eliciting a response.
These messages are a form of “spear phishing”: they target specific individuals with details that seem credible enough to take seriously.
Cybercriminals often ask for payment methods that are difficult to track or recover, including gift cards, reloadable debit cards, wire transfers, and cryptocurrency.
To protect yourself from these scams, McAfee advises people to:
- Set verbal codewords with trusted friends and family.
- Call the person back if you have any doubts.
- Think twice before sharing your voice on social media.
- Use identity monitoring services.
- Remove your name from data broker sites.