The rapid rise of various artificial intelligence (AI) platforms has caused a number of security concerns. McAfee recently published a report highlighting how AI has made it easier to create online voice scams.

The report surveyed 7,054 people from seven countries and found that a quarter of adults had previously experienced some kind of AI voice scam, with one in 10 targeted personally and 15% saying it happened to someone they know. Seventy-seven percent of victims said they had lost money as a result.

Fifty-three percent of adults share their voice data online at least once a week (via social media, voice notes and more), and 49% do so up to 10 times a week.

The research reveals that scammers are using AI technology to clone voices and then send a fake voicemail or call the victim’s contacts pretending to be in distress, with 70% of adults saying they are not confident they could distinguish the cloned version from the real thing.

Nearly half (45%) of the respondents said they would reply to a voicemail or voice note purporting to be from a friend or loved one in need of money, particularly if they thought the request had come from their partner or spouse (40%), parent (31%) or child (20%). Parents aged 50 or over were the group most likely to respond to a message from a child (41%). Messages most likely to elicit a response were those claiming that the sender had been involved in a car incident (48%), been robbed (47%), lost their phone or wallet (43%) or needed help while traveling abroad (41%).

But the cost of falling for an AI voice scam can be significant, with more than a third of people who’d lost money saying it had cost them over $1,000, while 7% were duped out of between $5,000 and $15,000.

The survey also found that the rise of deepfakes and disinformation has led to people being more wary of what they see online, with 32% of adults saying they’re now less trusting of social media than ever before.

The report also covered the accessibility, ease of use and efficacy of AI voice-cloning tools. Both free and paid tools are available, with many requiring only a basic level of experience and expertise to use.

The more accurate the clone, the better chance a cybercriminal has of duping somebody into handing over their money or taking other requested action. With these hoaxes based on exploiting the emotional vulnerabilities inherent in close relationships, a scammer could net thousands of dollars in just a few hours.

Using the cloning tools they found, researchers had no trouble replicating accents from around the world, whether from the US, UK, India or Australia, but more distinctive voices proved more challenging to copy. For example, the voice of a person who speaks with an unusual pace, rhythm or style requires more effort to clone accurately and is less likely to be targeted as a result.