AI Voice-Cloning Scams Could Target Millions, Warns Starling Bank

Millions of people may fall victim to AI voice-cloning scams, according to Starling Bank. The UK-based online bank warns that fraudsters can replicate a person’s voice from just three seconds of audio, potentially sourced from videos posted on social media. The cloned voice can then be used to impersonate the victim and trick their friends and family into sending money.

The bank’s recent survey of over 3,000 adults found that more than a quarter had been targeted by such a scam in the past year. Alarmingly, 46% of respondents were unaware these scams existed, and 8% said they would send money requested in such a call even if it seemed suspicious, as long as they believed the caller was a loved one.

Lisa Grahame, chief information security officer at Starling Bank, emphasized the risks of publicly sharing recordings of one’s voice. To counter the threat, the bank recommends that individuals agree a “safe phrase” with their close contacts: a simple, memorable phrase that can be used to verify a caller’s identity during phone conversations.

Starling Bank advises agreeing the safe phrase in person or over the phone rather than by text; if it must be sent by text, the message should be deleted as soon as it has been read. As AI technology advances, the potential for its misuse in scams and misinformation continues to grow, raising significant concerns for consumers.