What is an AI Voice Scam?
Before we discuss AI kidnapping scams, we must first understand AI voice scams. An AI voice scam uses artificial intelligence (AI) to mimic a person’s voice in order to trick the recipient of a call or voice message.
AI Voice Scam Leads to AI Kidnapping Scam
Ransom scams are not new, but artificial intelligence has made kidnapping scams much more believable. For as little as $5 a month, artificial intelligence (AI) programs enable con artists to clone voices so that callers sound like their purported captives. The scam aims to play on the emotions of the “victim’s” parents or relatives, creating a sense of urgency and panic that leads them to act without thinking and send money in exchange for the victim’s release.
Understanding AI Voice Kidnapping Scam
Scammers use artificial intelligence and voice synthesis technology to create convincing voice messages that simulate a kidnapped loved one in distress. A survey by McAfee, a computer security software company, found that it takes only three seconds of audio to replicate a person’s voice. Scammers typically extract voice audio from public social media pages or similar websites and manipulate it into a recorded message. McAfee also found that 70% of people said they weren’t confident they could tell the difference between a cloned voice and the real thing.
How are people targeted for AI Voice Kidnapping Schemes?
Since this is a relatively new technology, there’s no substantial amount of data available on how many people are targeted annually by virtual kidnappers. However, based on recent cases, scammers often gather information from publicly available sources such as social media profiles to identify potential targets and their relationships.
How to Recognize an AI Voice Kidnapping Scam
Artificial intelligence is not perfect and can’t make an exact clone of anyone’s voice (yet), so there are signs you can look for to recognize a scam.
9 Ways to Spot AI Voice Scams:
- Unexpected Call or Message: If you receive a sudden phone call, voicemail, or message claiming that a loved one has been kidnapped, injured, or is in danger, be skeptical. Scammers often use surprise to catch you off guard.
- High Emotional Content: The message may contain cries for help, threats, and pleas for immediate action. Scammers play on your emotions, using urgent and distressing language to create panic.
- Request for Ransom: Scammers will ask you to pay a specific amount using untraceable methods like cryptocurrency, prepaid cards, or wire transfers.
- No Direct Contact: Scammers may claim that involving law enforcement or notifying others will result in harm to the supposed victim. This discourages you from seeking help or verifying the situation.
- Lack of Verification: Scammers avoid giving you time to verify the situation. They discourage you from making contact with the alleged victim through known contact information.
- Inconsistencies in the Story: Listen carefully for inconsistencies in the story. Scammers may not have accurate information about the victim or may make mistakes in their narrative. Ask as many questions as you can.
- Caller ID Spoofing: Scammers often use caller ID spoofing to make it appear as if the call is coming from a legitimate source. Don’t solely rely on caller ID to verify the authenticity of the call.
- Verify with Known Contacts: Before you act, verify the situation independently by reaching out to the person claimed to be in danger using their verified and legitimate contact details.
- Alert the Authorities: If you suspect a virtual kidnapping is in progress, have another person call 911 and report the incident to the FBI.
What about Snapchat Scams?
Scammers don’t just extract voice audio from Snapchat and other social media sites to feed the AI programs behind their ransom schemes. Snapchat’s integrated AI feature, My AI, lets users customize the chatbot’s name, design a custom Bitmoji avatar for it, and bring it into conversations with friends. This creates greater potential for exploitation of younger users, and parents are expressing concerns.
If your child uses Snapchat’s My AI feature, be sure to discuss the importance of not sharing personal information with the chatbot. Even though the chatbot may seem like any other friend on Snapchat, it harvests the data it is given, and that data can be used for nefarious purposes.
Protecting Yourself from AI Voice Scams
While such stories are frightening, this isn’t a Liam Neeson film. These incidents are still rare, but they are surging. To minimize your risk of becoming a target:
- Educate Your Loved Ones: Warn your family and friends about these scams so they can stay vigilant and informed. Consider creating a family password (say it face-to-face, away from devices) that only you know and can use to verify on the phone.
- Don’t Share Personal Information: Never provide personal or financial information to unsolicited callers. Be cautious when answering calls from unknown numbers; if you do answer, wait for the caller to speak first. This can help reduce the risk of your voice being recorded and cloned.
- Review Privacy Settings: Share less online, and tighten your social media privacy settings to keep personal details away from scammers.
If you need an expert to identify a scam, use Hogo’s ScamAssist tool to further investigate potential scams.
While technology has certainly made deepfakes a major concern, AI itself is not the problem; the problem is the criminals who use it. As technology advances, scammers refine their tactics to exploit it. AI kidnapping voice message scams demonstrate the importance of being cautious and informed in our digital interactions. By staying alert and verifying information, you can protect yourself from scams and promote online safety.