AI Voice Cloning Scam Targets Elderly with Fake Direct Debits

Urgent Warning Over AI Voice Cloning Scam Targeting Older People

National Trading Standards has issued an urgent public alert about a sophisticated new phone scam in which criminals use artificial intelligence to clone people's voices and set up unauthorised direct debits from their bank accounts.

How the AI Voice Scam Operates

The scheme begins with initial contact through what the criminals describe as 'lifestyle survey' phone calls. During these conversations, fraudsters gather extensive personal information from their targets, including health details, financial circumstances, and other sensitive data that can be exploited.

Once sufficient personal information has been collected, the criminals use advanced AI technology to create convincing voice clones of their victims. These artificially generated voice recordings are then presented to financial institutions as supposed 'proof' of consent for setting up direct debit arrangements.

Elderly Population Particularly Vulnerable

National Trading Standards has indicated that the scam appears to be deliberately aimed at older individuals, who may be less familiar with emerging technologies and more trusting of telephone communications. The organisation has expressed particular concern about the psychological impact on victims, many of whom remain completely unaware that they are being defrauded until they notice unusual transactions on their bank statements.

Protective Measures Recommended

The public is being urged to take several precautionary measures to protect themselves and their loved ones from this evolving threat:

  • Regularly discuss scam phone calls with friends and family members, particularly elderly relatives who may be more vulnerable
  • Conduct frequent and thorough checks of bank statements for any unauthorised transactions
  • Report any suspicious activity or unrecognised direct debits immediately to both banks and relevant authorities
  • Exercise extreme caution when sharing personal information during unsolicited telephone calls
  • Verify the legitimacy of any organisation requesting financial authorisation through independent channels

This warning comes as technological advancements in artificial intelligence make voice cloning increasingly accessible to criminals, creating new challenges for fraud prevention agencies and financial institutions across the country.