Artificial Intelligence Makes Voice Cloning Scams More Convincing; Don’t Panic

Date: 03/05/2024
  • Voice cloning scams have been around for years. However, the scams continue to evolve as artificial intelligence (AI) improves and becomes more accessible.
  • A recent high-profile voice cloning scam happened in New Hampshire, where residents received calls mimicking the voice of U.S. President Joe Biden and urging them not to vote.
  • More common voice cloning scams claim to be from family members, friends or co-workers to create panic in hopes victims will give away money or personal information. There are many different variations of the scam, including kidnapping scams.
  • AI technology has become so advanced that criminals use it to search the internet for information about people and businesses to target victims. AI tools can also add sounds like laughter, fear and other emotions into the cloned voice, as well as sound effects like a subway station, an airport or a car crash to make the scam more convincing.
  • Don’t panic. The likelihood of any one individual being personally targeted is low, though businesses and their employees face a higher risk. If you receive a suspicious call, hang up and contact the person directly. If you cannot reach them, check in through family, friends or co-workers.
  • To learn more about voice cloning scams, or if you believe you have been the victim of an identity crime, contact the Identity Theft Resource Center toll-free by phone (888.400.5530) or live chat on the company website, idtheftcenter.org.

With the rise of artificial intelligence (AI)-fueled scams, how do you know if the person on the other end of a phone call is a friend or foe? Voice cloning scams have been around for years. However, the cons continue to evolve as AI improves and becomes more accessible and easier to use.

A recent high-profile voice cloning scam happened when New Hampshire residents received AI-generated robocalls mimicking U.S. President Joe Biden’s voice asking voters to “save your vote for the November election.” A more extreme example involves a business in Asia where cybercriminals used real-time voice cloning and deepfake video technology to stage a Zoom call in which cloned executives instructed a team member to wire $25 million.

Not only can AI be used to clone celebrities and public figures, but it can also be used in everyday voice cloning scams that claim to be from friends, family members or co-workers. These AI-fueled scams have many variations, ranging from election misinformation to kidnapping for ransom scams.

Who Are the Targets?

Identity criminals are generally not interested in attacking individuals unless they have a high net worth or are employees with access to business information or systems. Cybercriminals prefer attacks they can automate and use against large groups of people – like automated phishing campaigns. Voice cloning scams are the exact opposite: labor-intensive and narrowly targeted. That’s why businesses are more likely to be targeted, along with high-profile people with obvious financial resources.

Identity criminals increasingly find their targets using AI programs that search the internet for information about people and businesses – including audio or video posts on social media or the web – as well as details that can be used to make compelling calls to victims.

What Is the Scam?

Scammers use AI tools to clone the voices of individuals they target on social media or the web to make calls to family, friends or co-workers. AI tools require as little as three seconds of a voice to create a realistic (enough) clone. Criminals can also spoof a phone number so it looks like a known caller. Using AI tools, criminals add sounds like laughter, fear and other emotions into the cloned voice, as well as sound effects like a subway station, an airport or a car crash. The technology is so advanced that scammers can also add accents and age ranges.

What They Want

Just like in traditional “Grandparent” or “Business Email Compromise (BEC)” scams, cybercriminals use a variety of tactics to create a sense of panic or urgency – like claiming a loved one is in danger, or an important vendor must be paid NOW! The criminals hope to scare people into sending money or sharing business or personal information that can be used in another identity crime.

How to Avoid Voice Cloning Scams

  • Hang up and don’t panic. Bad actors count on your fear or sense of duty to get you to take an action you otherwise would not. If you have any doubt about who is calling you and what they are asking you to do, hang up, collect your thoughts, and contact the person who supposedly called you to verify the situation. If you cannot reach them, connect with them through other family, friends or co-workers (if the call is business-related).
  • Be vigilant on all phone calls, even if you recognize the voice. Listen for odd statements, questions, or requests, especially for money or personal or business information. If you think something might not be right, ask questions that only the real person would know.
  • Avoid personalized voicemail messages. They can give bad actors easy access to your voice. Instead, use automated tools offered on mobile devices and office phone systems.

Contact the ITRC

If you have additional questions about voice cloning scams or believe you were the victim of an identity crime, contact us. You can speak with an Identity Theft Resource Center expert advisor toll-free by phone (888.400.5530) or live chat on the company website. Just visit www.idtheftcenter.org to get started.

This post was originally published on 3/2/20 and was updated on 3/5/24.

How much information are you putting out there? It’s probably too much. To help you stop sharing Too Much Information, sign up for the In the Loop newsletter.
