How to avoid falling victim to an AI deepfake voice scam

So-called ‘deepfake’ voice-cloning scams are booming due to the sheer ease of making a convincing voice clone of a person.

Voice cloning scams are booming in Britain. (Getty)

The innocent act of sharing a video online could lead to a new form of AI-enhanced scam - where a lifelike ‘voice clone’ of a person is used to ‘call’ family members and friends to steal money.

So-called ‘deepfake’ voice-cloning scams are booming due to the ease of making a convincing copy of a person's voice.

Research this year by Starling Bank, released as part of a campaign fronted by actor James Nesbitt, found that 46% of British people still don’t know such a scam exists - yet 28% believe they have already been targeted by AI voice scammers.

Yahoo spoke to top British cybersecurity experts about the telltale signs of a voice cloning scam and how to stop scammers in their tracks.

In a typical voice-clone scam, victims receive a call from someone posing as a friend or family member who suddenly needs money, or are contacted at work and asked to authorise large transfers of money.

As with other scams, cybercriminals typically try to create a sense of urgency, for example by claiming that someone is trapped abroad.

A Microsoft demonstration was able to create a voice clone from a three-second sample of someone’s voice - meaning that ordinary social media posts can be fuel for attacks.

Online software such as ElevenLabs requires just minutes of a person’s voice to create a clone that can say anything in response to typed input.

Online AI software such as ElevenLabs requires just minutes of a person’s voice to create a clone. (Getty)

ElevenLabs’ technology was used to create a ‘clone’ of Joe Biden’s voice, which was used to target Democratic voters with fake voicemails discouraging them from voting.

Research in 2023 by cybersecurity company McAfee found that one in ten respondents had experienced some kind of AI voice scam themselves or knew someone who had been targeted - with 77% of those targeted losing money as a result.

Starling Bank, which carried out the research, suggested family members agree a ‘safe phrase’ to use during calls so they know they are speaking to the real person, not a clone.

Brian Higgins, security specialist at Comparitech, said that using a ‘safe phrase’ is the ‘absolute best advice’.

But he said that if you are suspicious, the automated ‘bots’ used in some attacks can easily be disoriented.

Higgins said: "Bots are coded to react to individual words, so if you use some random language at the beginning of your call the code won’t work. Be shouty and unusual."

Audio deepfakes can be convincing, but they are not always perfect - and if you are suspicious, you should listen out for giveaways, one expert told Yahoo UK.

You should also ask lots of questions to try to trip up potential scammers, says Simon Newman, co-founder of Cyber London and a member of the International Cyber Expo Advisory Council.

Newman said: "For audio deepfakes, there are a few tell-tale signs. Listen out for slurring of certain words, or a lack of emotion in the way the person speaks.

“If you think you’re being called by a deepfake, ask them lots of questions that only the real person will know the answer to. Do not give out any personal information until you are completely satisfied that you can confirm the identity of the person on the other end of the line.

The scammers will try and impersonate family members. (Getty)

"One way of doing this is to put the phone down and call them on the number you usually use for them.

“As technology improves, it will become even harder for people to spot deepfakes. It’s therefore important that if you have any doubt, end the call and report it straight away to Action Fraud."

Voice clone scammers rely on exactly the same tactics as ordinary scammers, which means you should be suspicious of any unexpected call, says Javvad Malik, lead security awareness advocate at KnowBe4.

Scammers will try to apply emotional pressure by creating artificial ‘urgency’, Malik says - and that is the time to step back and think carefully about whether it really is the person they claim to be.

Malik said: "When dealing with any scam, including deepfakes, one should consider whether the communication is expected, whether the person is making an unusual request, and finally whether they are trying to pressure you into making a quick decision.

“For example, setting a deadline to meet for work purposes, or through intimidation that by not complying there could be repercussions or any other emotionally charged tactic such as claiming that a loved one is in danger.

"If these red flags are spotted, then people should pause to think about whether this is genuine communication or not and seek to verify the authenticity through other means."

If you’re unsure about a call or need advice, contact Action Fraud on 0300 123 2040 or visit the Action Fraud website for more advice on fraud including deepfakes.
