As AI (artificial intelligence) becomes increasingly advanced and accessible, more and more people are adopting new tools to make their lives easier. Unfortunately, criminals are also twisting this technology to their own nefarious ends.
One such danger sees AI-powered deepfake technology used to clone a person’s voice in order to deceive and defraud unsuspecting targets.
The danger of voice cloning
There has been a spate of recent stories of fraudsters attempting to swindle money from victims using voice cloning AI technology.
Jennifer DeStefano received a phone call from what sounded like her daughter, in floods of tears and asking for help. The voice on the call claimed she had been kidnapped and was in serious trouble. The phone was then passed to a male voice who warned Jennifer of violent consequences if a $1 million ransom wasn’t paid.
Jennifer instantly recognised the voice and had no doubt that it truly was her daughter.
“It was never a question of who is this? It was completely her voice. It was her inflection. It was the way she would have cried”.
Thankfully, Jennifer was able to contact her husband while still on the call with the fraudster, and he confirmed their daughter was safe.
In another case, TikTok personality Brooke Bush spoke of how she believed her brother had died in a car crash after her grandfather received a phone call supposedly from her brother asking for help. Brooke says the scammer impersonating her brother claimed that he had been in a car accident, had unintentionally killed somebody, was now in jail and needed money for bail.
In reality, the call was from a scammer attempting to steal money from her grandfather using AI voice cloning software to impersonate Brooke’s brother.
The social media personality warned: “If you guys ever get called and it’s someone asking for money that you know, they’re using a freaking AI machine to re-enact their voice.”
The new technology helping fraudsters
While scams like this have been around for a long time, they previously relied on existing audio clips of a person’s voice being pieced together like a jigsaw, often with very mixed results.
This new development in artificial intelligence means that fraudsters can realistically replicate the voice of almost anybody who has ever spoken for more than a few seconds on camera or radio, and use it to impersonate and target family members, friends and colleagues.
According to one AI expert, voice cloning technology like this can replicate a voice from as little as three seconds’ worth of audio; however, the more audio available to sample, the more accurate the results can be.
Jennifer DeStefano says that while her daughter does not have a public social media account, she does appear in public interviews for her school and the sports she is involved in, where criminals may have been able to capture her voice.
For children and adults who do have public social media videos, the risk of criminals stealing their voice is even greater, thanks to the readily available material.
A natural progression for scammers
It’s common for criminals to use publicly available information to impersonate relatives and colleagues in order to swindle money from targets.
British entrepreneur Steven Bartlett shared a post on LinkedIn alongside Brooke Bush’s video, warning of the danger posed by this emerging scam method. He cautions that “this tactic will very likely be used against businesses. Scammers will clone the voice of a CEO, manager or director and use it to call clients, employees or partners and ask them to make cash withdrawals.”
How to spot an AI voice cloning scam
The incidents involving Jennifer DeStefano and Brooke Bush underscore the need for education and awareness of the dangers posed by deepfake technology.
Not all voice cloning scams will be as dramatic or as aggressive as those mentioned previously. Fraudsters may simply impersonate a family member or a colleague to ask for bank details or gift cards, for any number of seemingly legitimate reasons.
While there is a lot of overlap between regular impersonation scams and these deepfake scams, the key difference is that deepfake scams use the likeness of somebody you trust and care for, rather than simply impersonating an unknown bank employee, postal worker or utility provider.
Because of this, it’s worth knowing what to look for to stay safe when the person being impersonated is somebody you know well.
Ask questions
If you have any suspicion whatsoever that the person you are speaking with may not be who they claim, the best and easiest way to check is to ask questions. The more pressure you can put on the fraudster to answer questions, the better placed you will be to see through their disguise.
You should ask them something that can’t easily be found online, such as when you last met or a specific event you both attended.
Look for inconsistencies
This may be easier said than done, especially if the fraudster is pressuring you, but try to spot inconsistencies in the conversation. This could be as simple as realising the person being impersonated is acting completely differently from how they normally would, or that they’re using words they wouldn’t usually use.
For example, if you go by a nickname, are they addressing you correctly? Are they using words that you wouldn’t expect them to use, or phrasing things in a strange way?
Take a moment
Fraudsters will try to pressure you and not allow you time to think or speak with other people, often using threats or claiming the need for help is urgent.
Try to find a way to delay the fraudster and tell them you need to phone back later. You could blame this on being on a train going through a tunnel, on somebody knocking at the front door, or simply on your phone battery dying. Once you’re off the phone and no longer under time pressure, you are in a better position to check whether the call was genuine and to stay secure.
The rise of deepfake technology is truly a cause for concern. It has the potential to undermine trust not only in legitimate organisations like HMRC and banks, but also between friends and family members. As AI technology continues to evolve and improve, it is certain that we will see even more sophisticated deepfake face and voice scams in the future.