
AI Dangers Of Cloned Voice Scams — What To Know

Published: April 29, 2024 in our Security Fraud News & Alerts newsletter.



Scammers of all kinds are finding artificial intelligence (AI) an immensely helpful tool. In particular, they are having serious success with AI-enabled voice swindles. It takes only three seconds of audio to clone a person’s voice, which scammers then use in a variety of schemes, including a distressed “impostor” phone call to family or friends. Knowing how these scams work is the best way to keep yourself out of the grip of AI voice cloning and its financial rip-off consequences.


Quick stats: These AI voice-cloning scams work so well that a recent McAfee survey found 77% of victims ended up losing money, with nearly one-third losing more than $1,000.


Tugging at Your Heartstrings


The scammer typically poses as a relative of the victim, one who’s in financial trouble and needs money to fix whatever jam they’re in. The range of problems varies, but tugging at heartstrings is a tool scammers always count on to work. It can be as simple as a grandchild needing to pay parking tickets to get their car back, and they don’t want their parents to find out. The dutiful grandparent shells out the payment in whatever form they’re told will work.

The simplest and most immediate way to tell a real phone call from an AI clone is to prepare your family ahead of time. Choose a particular phrase or key word to verify the troubled family member is who they say they are, legitimizing the call, or not!



Tips For Keeping It Real


Below are tips for spotting AI voice-cloning scams and strategies to help determine whether a call is indeed the real deal.


  • Toss up a red flag for any phone call requiring you to act fast and pressuring you for money, even when the voice is one you recognize.

  • After receiving a call for financial help, hang up and call the person back using the phone number saved on your device. Remember, caller IDs can be easily faked.

  • Beware of payment demands in unusual forms like gift cards, payment apps, wire transfers, and even cryptocurrency.

  • Don’t provide personally identifiable information (PII) that can be used against you long after the scam is foiled.

  • Limit posting TMI (too much information) on social media, especially clips of your voice. AI needs only three seconds of audio to clone your voice for a scam.


It’s easy to see how AI is helping scammers and cybercriminals develop more sophisticated, successful attacks. You can bet AI voice-cloning scams will keep improving too, so buckle up and stay tuned.


Want to schedule a conversation? Please email us at advisor@nadicent.com


