AI-Enabled Voice Scams Rising Rapidly In India: Scammers Extort Money With Fake Calls Impersonating Relatives
Artificial Intelligence (AI) technology is fueling the rise of online scams. One of the most notorious involves cloning a person's voice and then targeting a victim over a phone call. A recent survey has revealed that Indian mobile users are an increasingly preferred target.
A large share of the Indian respondents who fell victim to such scams reported losing money to calls that used faked voices of relatives, friends, or acquaintances. Let's look at this concerning new scam, which needs only a few seconds of audio to impersonate someone's voice.

Indians Being Scammed Using Voices Faked By AI
OpenAI's ChatGPT and DALL-E, and Google's Bard are just a few of the popular generative AI tools. There are dozens of other AI-enabled tools that offer a wide range of features and functions. Quite a few platforms allow cloning or altering voices, and some of these are being misused to run scams.
Online security firm McAfee recently published a report based on a survey of 7,054 people across seven countries, including 1,010 respondents from India. It dealt mainly with Artificial Intelligence-enabled voice scams run by imposters.
Identity theft and impersonation are rampant across the world. However, Indian mobile phone users are increasingly being subjected to voice scams, the report suggests.
"About half (47 percent) of Indian adults have experienced or know someone who has experienced some kind of AI voice scam, which is almost double the global average (25 percent). 83 percent of Indian victims said they had a loss of money - with 48 percent losing over Rs. 50,000"
How Can Indians Shield Themselves From AI-Enabled Voice Scams?
According to the report, 69 percent of Indians think they don't know or cannot tell the difference between an AI voice and a real voice. Simply put, an overwhelming majority of the surveyed users admitted they could be fooled by a call that appeared to come from a friend, family member, or acquaintance.
AI engines have become quite potent. A few platforms can clone a person's voice from just three seconds of audio, which significantly raises the likelihood of successfully scamming unsuspecting mobile phone users.
A convincing conversation that appears to come from a known person allows scammers to extract large sums of money with ease. Hence, the report suggests agreeing on a verbal codeword with family members and trusted close friends, which can then be used to verify the identity of a caller. Additionally, a video call can help confirm the identity of anyone requesting money.