September 20, 2024

Starling Bank warns of AI voice cloning scams  

The online-only lender said that these scams are highly effective, with millions of people potentially at risk…reports Asian Lite News

Starling Bank has issued a warning about a new wave of scams that use artificial intelligence to replicate people’s voices. Fraudsters can create convincing voice clones from just a few seconds of audio, often found in online videos, the bank said in a press release.

The online-only lender said that these scams are highly effective, with millions of people potentially at risk. The bank’s survey found that over a quarter of respondents had been targeted by such scams in the past year, and many were unaware of the threat, CNN reported.

According to the survey, 46% of respondents were unaware that such scams existed, and 8% said they would send over as much money as requested by a friend or family member, even if they thought the call seemed strange.

To protect themselves, people are advised to establish a “safe phrase” with their loved ones – a unique phrase that can be used to verify identity during phone calls. The bank advised against sharing the safe phrase over text, which could make it easier for scammers to find out, but said that, if it is shared this way, the message should be deleted once the other person has seen it.

As AI technology continues to advance, concerns about its potential for misuse are growing. OpenAI, the creator of ChatGPT, has itself acknowledged the risks associated with voice replication tools.

Lisa Grahame, Chief Information Security Officer at Starling Bank, commented: “People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters. Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a Safe Phrase to thwart them. So it’s more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim.”

“We hope that through campaigns such as this we can arm the public with the information they need to keep themselves safe. Simply having a Safe Phrase in place with trusted friends and family – which you never share digitally – is a quick and easy way to ensure you can verify who is on the other end of the phone.”

Once told what AI voice cloning scams entail, 79% of UK adults said they were concerned about being targeted – more than for HMRC / High Court impersonation scams (75%), social media impersonation scams (76%), investment scams (70%) or safe account scams (73%).

Lord Sir David Hanson, Minister of State at the Home Office with Responsibility for Fraud, said: “AI presents incredible opportunities for industry, society and governments but we must stay alert to the dangers, including AI-enabled fraud. As part of our commitment to working with industry and other partners, we are delighted to support initiatives such as this through the Stop! Think Fraud campaign and provide the public with practical advice about how to stay protected from this appalling crime.”

To launch the campaign, Starling Bank recruited leading actor James Nesbitt to have his own voice cloned by AI technology, demonstrating just how easy it is for anyone to be scammed.

Commenting on the campaign, Nesbitt said: “I think I have a pretty distinctive voice, and it’s core to my career. So to hear it cloned so accurately was a shock. You hear a lot about AI, but this experience has really opened my eyes (and ears) to how advanced the technology has become, and how easily it can be used for criminal activity if it falls into the wrong hands. I have children myself, and the thought of them being scammed in this way is really scary. I’ll definitely be setting up a Safe Phrase with my own family and friends.”
