Silent calls used to clone your voice: how the new scam works and how to defend yourself

You receive a call from an unknown number, answer it and hear only silence on the other end. You repeat "Hello, who is it?" a couple of times, then hang up assuming it was a wrong number, and quickly forget the whole thing. Yet that handful of seconds can be unexpectedly valuable to cybercriminals: it can be enough to confirm that your number is active and, much worse, to capture audio samples useful for cloning your voice with artificial intelligence. This technique, which combines old phishing strategies with new voice cloning tools, is increasingly used for online fraud and identity theft. Some security experts report that as little as 3 seconds of recording may be enough to recreate a voice very similar to the original. From there, scammers can use the cloned voice for their own purposes, with serious consequences both for the person whose voice was cloned and for their acquaintances and contacts. Given the danger of this hacking strategy, let's look at how the silent-call voice cloning scam works and how to defend yourself.

How the silent-call voice cloning scam works

It all begins with a silent phone call. Scammers use automated systems that dial thousands of numbers a day. When you answer, even simple background noise or a cough is enough to reveal that the number is active and belongs to a real person. At that point, your number is "marked" as active and entered into databases that circulate among criminal networks. Some groups will use it for further phishing attempts; others will sell the information to robocalling operations (automated calls made for fraudulent purposes) or to those who intend to create voice clones.

The most insidious risk arises when your voice is recorded. AI voice cloning technologies are now so advanced that they can reproduce tone, rhythm and inflection with impressive realism, all from just a few seconds of recording. According to research conducted in 2023 by MSI-ACI in collaboration with McAfee, which surveyed 7,000 people across 9 countries, just 3 seconds of audio is enough to generate a clone with 85% similarity to the original voice and, even more alarmingly, with a few additional recordings the accuracy can exceed 95%. With these tools available, a scammer can create a voice message that sounds like a family member asking for help after an accident, an employee asking for a salary advance, and so on.

These techniques are part of a broader phenomenon known as "spear phishing": attacks aimed at a specific person and built on real information collected online, which makes the scam particularly credible and maximizes its effectiveness. Criminals often obtain their victims' personal details from the traces they leave online: posts, comments, tags or locations shared via the "Instagram Map". This data makes the message more convincing, for example by referencing a recent trip or a real family member.

According to McAfee data, 1 in 4 people have already had direct or indirect experience with voice clone scams, and 77% of victims lost money as a result. 36% of those interviewed reported losses between 500 and 3,000 dollars, and 7% reported damages of up to 15,000 dollars.

What makes the picture even darker is how easily effective voice cloning tools can now be accessed. Researchers have found more than a dozen free voice cloning programs online, many of which require minimal technical expertise to use. Some can also reproduce accents from different languages and regions, broadening the potential reach of attacks.

Once scammers have a clone, the next step is fraud. They can call a bank impersonating the victim, for example to request a credential reset or to authorize bank transfers, perhaps combining voice cloning with other techniques such as spoofing (which disguises the caller's number). They can also use the cloned voice to contact friends or family and ask for money for made-up emergencies, or deploy it as "audio evidence" in blackmail attempts and romance scams.

How to defend yourself against online scams based on voice cloning

Given how insidious these attacks are, here are the defensive measures you can take against online scams based on voice cloning. Keep the following points in mind.

  • Don’t answer unknown numbers: if someone is calling for a legitimate reason, they can leave a voicemail or try to reach you another way. If you do decide to answer, stay silent and don’t speak.
  • Activate automatic blocking of unknown numbers: this feature is now present on all recent smartphones and helps mitigate the problem of scam calls, even if it doesn’t eliminate it completely.
  • Never share personal information over the phone: not even with someone who claims to represent a trusted company, or with “friendly voices” of colleagues, family members and acquaintances, since those voices may themselves have been cloned. If a caller insists, end the conversation and call the organization or person back by manually dialing their number on your smartphone’s dialer. Bonus tip: to protect your loved ones, you might also set up a safe word, a code word agreed on in advance to verify each other’s identity in case of a real emergency.
  • Limit the voice content you share online: in the age of social media and instant messaging, this advice may not be practical for everyone. If you really can’t limit sending or publishing material that contains your voice, at least try setting your social profiles to private.