A terrifying scam is rapidly gaining momentum: fraudsters now use AI-generated voice clones to impersonate relatives and extort money from unsuspecting victims. With just a short audio clip pulled from your social media posts or past phone calls, scammers can mimic your voice convincingly.
In today’s digital world, deepfake technology and voice cloning are evolving rapidly, courtesy of artificial intelligence (AI). As AI innovation accelerates, misuse of these tools can manipulate digital content, leading to fraud and to damage to personal or organisational reputations.
“Deepfake and voice cloning are interconnected yet distinct concepts,” says Mert Çıkan, Product Owner at SESTEK.
“Deepfake technology involves employing advanced AI algorithms to produce media—videos, audios, images, and text—that deceptively appear realistic, despite being fabricated or altered. Voice cloning, on the other hand, is a unique subset within deepfake technology, focusing on audio manipulation. With this technique audio content can be synthesized that sounds like a specific individual,” he says.
In recent years, there have been multiple audio deepfake and voice-cloning incidents in Nigeria and across the world.
For instance, in April 2023, an audio clip surfaced online purportedly featuring Labour Party presidential candidate, Peter Obi, referring to the 2023 election as a “religious war” in an alleged conversation with the presiding Bishop of Winners Chapel International, David Oyedepo. Obi denied the authenticity of the “Yes Daddy” audio leak, labeling it as “fake” and “doctored,” and suggested it was an attempt to discredit him prior to Nigeria’s 2023 general election.
Another audio deepfake circulated online, allegedly capturing a conversation between former Nigerian president, Olusegun Obasanjo; Nigerian musician, Charly Boy and former Cross River State governor, Donald Duke. The audio clip suggested that Obasanjo was urging protests against the 2023 election results. However, a fact-check by TheCable concluded that the audio was doctored and did not meet the authenticity threshold when compared with verified samples of Obasanjo’s voice.
On the global scene, a man in Los Angeles, United States, was swindled out of $25,000 after fraudsters used AI to replicate his son’s voice, convincing him of a fabricated emergency.
A 2023 Guardian newspaper report revealed that AI scams now exploit voice cloning to defraud consumers. The report quoted the Southern African Fraud Prevention Service (SAFPS) as saying that impersonation attacks increased by 264 per cent in the first five months of 2023 compared with 2021, and noted that “cybercriminals are leveraging Artificial Intelligence (AI) through cloning to defraud unsuspecting consumers.”
The FactCheckHub spoke with at least two leading experts combatting information disorder in Africa – Lee Nwiti, Chief Editor at Africa Check in South Africa and Silas Jonathan, Digital Investigation Manager at the Digital Technology, Artificial Intelligence, and Information Disorder Analysis Centre (DAIDAC) in Nigeria.
“Scammers now use AI to clone voices with chilling accuracy — impersonating friends, family, or colleagues to trick victims into handing over money or sensitive information,” Nwiti warns.
But as concerns grow over the misuse of AI in everyday life, Jonathan emphasizes that awareness is the first and most crucial step in protecting oneself from AI-powered voice cloning scams.
“You can’t protect yourself from something you don’t even know exists,” he warns, highlighting the urgent need for public education on the realities of voice cloning technology. “The first thing people need to know is that there is a possibility that their voices can now be cloned,” he added.
Safety Tips:
The experts suggested the following ways to protect yourself and your loved ones from falling victim to this new form of scam and audio deepfakes:
1. Know that voice cloning and deepfakes are real
AI voice cloning scams are no longer science fiction. With just a few voice samples, often pulled from social media videos, voice notes, or phone calls, scammers can create convincing voice replicas that sound just like someone you know. You can’t guard against what you don’t know exists. Voice cloning is real, and so are audio deepfakes – beware!
2. Watch out for emotional manipulation
Fraudsters typically create a sense of urgency — for example, claiming a loved one has been in an accident or kidnapped — and then demand immediate money transfers. Always pause before reacting.
3. Evaluate the audio or voice thoroughly
Before you interpret any alleged audio leak or phone call, ask yourself: “Now that I’m aware that voices can be cloned, is this true? Is the alleged person likely to say something like that?” Always question the authenticity of controversial or unexpected audio leaks and strange phone calls.
4. Verify before you act
Don’t trust the voice or audio clip alone. Call the person involved on their regular phone number, or contact a close friend or family member who is near them to verify the situation. If it doesn’t check out, it’s likely a scam.
You can also search online for reports from credible media outlets or other corroborating evidence that supports or refutes the audio.
5. Consult experts
Share such audio clips or suspicious call recordings with experts or fact-checking organisations like FactCheckHub, Africa Check, DAIDAC or Dubawa, among others. Fact-checkers have the tools and expertise to analyse such audio and can advise on whether a clip or recording is authentic. You may also submit it to The FactCheckHub team via our mobile app for verification.
6. Avoid hasty conclusions
Audio can be tricky to verify because it is not visual; sometimes there is nothing to show or compare. Be cautious with audio-only evidence, as the lack of visuals makes it easier to manipulate.
7. Set up family code words
Agree with close family and friends on secret phrases or code words that only you would know. If you ever receive a suspicious voice message, audio clip or phone call, ask for the code word before continuing the conversation.
8. Practice good digital hygiene
Limit the amount of personal voice content you share online. Be mindful of what you post publicly, and avoid oversharing details that could make cloned phone calls or audio clips more convincing.
9. Report scams to appropriate local authorities
Even if you didn’t lose money, report the incident, scam or audio leak to the appropriate local authorities. For Nigerians, report such scams to the Nigeria Police Force-National Cybercrime Centre (NPF-NCCC) via their official X account or the Police HQ cybercrime unit, or the National Information Technology Development Agency (NITDA) through their Computer Emergency Readiness and Response Team (ngCERT) portal.
These voice-cloning scams and audio deepfakes are sophisticated, but awareness and preparation can help you stay ahead. Talk to your friends and family about these tactics, and set up a plan. And remember: if a call from a familiar-sounding voice feels like a scam, it probably is. Hang up and verify.
