Research released by Starling Bank finds 28% of people have been targeted at least once in the past year
Starling Bank says people frequently post content online that gives scammers access to their voice. Photograph: Dominic Lipinski/PA
Consumers have been warned that their social media videos could be exploited by scammers to clone their voices with AI and then trick their family and friends out of cash.
Scammers look for videos that have been uploaded online and need only a few seconds of audio to replicate how the target speaks. They then call or leave voicemails for the target’s friends and family, asking them to send money urgently.
Research released by the digital lender Starling Bank found that 28% of people had been targeted by an AI voice cloning scam at least once in the past year. However, 46% of people did not know this type of scam existed, and 8% said they would be likely to send whatever money was requested, even if they thought the call from their loved one seemed strange.
Lisa Grahame, the chief information security officer at Starling Bank, said: “People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters.”
The lender is now suggesting that people use a safe phrase with close friends and family to check whether a call is genuine.
“Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a safe phrase to thwart them,” Grahame said. “So it’s more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim.”
There is always a chance that a safe phrase could be compromised. Anyone wary of a voice call or message can also call a trusted friend or family member to sense-check the request, or dial 159 to speak directly to their bank.
The UK’s cybersecurity agency said in January that AI was making it increasingly difficult to identify phishing messages, where users are tricked into handing over passwords or personal details.
These increasingly sophisticated scams have even managed to dupe big international businesses.
Hong Kong police began an investigation in February after an employee at an unnamed company said she had been duped into paying HK$200m (£20m) of her firm’s money to fraudsters in a deepfake video conference call in which senior officers of the company were impersonated. The fraudsters are believed to have downloaded videos of the officers in advance and then used artificial intelligence to add fake voices for the video conference.
Lord Hanson, Home Office minister with responsibility for fraud, said: “AI presents incredible opportunities for industry, society and governments, but we must stay alert to the dangers, including AI-enabled fraud.”