- The rapid rise of artificial intelligence has allowed scammers to mount sophisticated scams using as little as three seconds of a victim’s voice
- Americans lost $2.6 billion to imposter scams in 2022, according to the Federal Trade Commission
Scammers are using artificial intelligence to mount terrifying schemes in which social media users’ voices are cloned, with the fakers then calling their targets’ parents, pretending to be in trouble and begging for cash.
So-called imposter scams, in which a fraudster impersonates someone in order to steal money, are the most common scam in the US, costing Americans $2.6 billion in 2022 alone, the Federal Trade Commission reported.
Artificial Intelligence has now supercharged the ‘family emergency’ scam, where the criminal convinces the victim that a family member is in distress and in immediate need of cash.
One in four respondents to a McAfee survey in April 2023 said they had some experience of an AI voice scam and one in ten said they had been targeted personally.
Scammers need as little as three seconds of audio, which can easily be extracted from a social media clip, to clone a person’s voice, McAfee’s study found.
Eddie Cumberbatch, a social media content creator with more than 100,000 followers, experienced the terrifying reality of AI’s potential to impersonate.
In April Cumberbatch’s grandparents received a phone call that sounded just like Eddie, telling them he had been in a devastating car crash and needed money immediately.
By a stroke of luck, Cumberbatch’s father overheard the conversation and was able to call his son to verify that the crash had not happened and that he was safe.
Despite managing to dodge the scam, Cumberbatch described the situation in a TikTok video as ‘absolutely terrifying.’
His grandparents would have taken out a second mortgage to send the requested money, the influencer explained.
A Canadian couple targeted by a similar AI voice scam were less lucky, and ended up losing CA$21,000 (US$15,449).
Benjamin Perkin’s parents were tricked by an AI clone of his voice that told them he was in jail for killing a diplomat in a car accident.
The voice, which sounded just like Perkin, told the panicked parents that he needed $21,000 for legal fees before going to court.
His frightened parents collected the cash from several banks and sent it to the scammer via Bitcoin, Perkin told The Washington Post.
The voice sounded ‘close enough for my parents to truly believe they did speak with me,’ he told the Post.
Although he is not certain how the fraudsters obtained his voice, Perkin said he has posted videos on YouTube discussing his snowmobile hobby.
Perkin’s parents filed a police report once they realized they had been scammed, but ‘the money’s gone,’ he said.
‘There’s no insurance. There’s no getting it back. It’s gone.’
The FTC warns that requests for money to be sent via a cryptocurrency such as Bitcoin are a hallmark of fraud, since such transfers are difficult to trace and all but impossible to reverse.
Richard Mendelstein, a software engineer at Google, lost $4,000 after receiving a distressing phone call in which he heard what sounded like his daughter screaming for help.
He was then told by her ‘kidnappers’ to withdraw $4,000 in cash as a ransom payment.
Mendelstein sent the money to a wiring service in Mexico City and only later realized he had been scammed and that his daughter was safe at school.
The rise of accessible and sophisticated AI is making scams quicker and easier to carry out, said Steve Grobman, McAfee’s chief technology officer.
‘One of the things that’s most important to recognize with the advances in AI this year is it’s largely about bringing these technologies into reach of many more people, including really enabling the scale within the cyberactor community,’ Grobman warned.
‘Cybercriminals are able to use generative AI for fake voices and deepfakes in ways that used to require a lot more sophistication.’
Vice President Kamala Harris also told the CEOs of leading tech companies in May that they have a growing moral responsibility to limit the damage their AI products do to society.