    AI Voice Cloning Scams: A Growing Threat, Warns Starling Bank

    Starling Bank in the UK is warning of an emerging form of fraud that uses artificial intelligence (AI) to clone voices, putting millions of people at risk. Scammers can now replicate a person's voice from as little as three seconds of audio, often sourced from publicly available online videos.

    The fraudsters then identify friends and family members of that person and use the cloned voice to make phone calls asking for money. This type of fraud has the potential to affect millions, Starling Bank noted in a press release on Wednesday.

    According to a survey of over 3,000 adults conducted by the bank in partnership with Mortar Research, more than a quarter of respondents said they had been targeted by an AI voice-cloning scam in the past year. Alarmingly, 46% of those surveyed were unaware such scams existed, and 8% said they would send money requested by a friend or family member even if the call felt unusual.

    Lisa Grahame, Starling Bank’s Director of Information Security, emphasized the danger, noting that people frequently post voice recordings online without realizing the risks this exposes them to. To counter these scams, Starling Bank advises families to agree on a “safe phrase” to verify identity during phone calls. However, the bank also cautions against sharing the phrase by text, as it could be intercepted by scammers.

    With AI becoming increasingly sophisticated at mimicking human voices, concerns are growing about its misuse, not only in financial fraud but also in spreading disinformation. Earlier this year, OpenAI, the creator of ChatGPT, unveiled its Voice Engine tool but withheld a public release, citing fears over the potential misuse of synthetic voices.
