Nick Giuliano's day started like any other, until a phone call he received while out running errands took a terrifying turn.
"I was sick to my stomach... worst possible situation imaginable," he said.
While Giuliano doesn't usually answer numbers he doesn't recognize, he says he was expecting a call, so he picked up.
"It sounded just like my daughter... crying and saying, 'Papa help me!'" Giuliano recalled. His daughter often bikes to school. Giuliano thought she might have been in an accident but then a man got on the phone and told him she'd been kidnapped.
The man threatened Giuliano saying he'd need to bring $10,000 if he wanted to see his daughter again.
Giuliano hung up. He texted his daughter, called the police, and called her school. When he couldn't get through, he went to the school in person.
"I needed to actually talk to her and touch her and look at her with my own eyes to feel that she was going to be okay," he said.
Giuliano's daughter was safe. The call was a scam.
He says the little girl's voice on the phone wasn't just similar but sounded exactly like his daughter's. The voice, coupled with the use of the word "papa," made him believe it could really be his daughter on the line.
It's all possible with the use of artificial intelligence voice-cloning.
"You give them text to speak and then they speak in a particular emotion and voice," explained Professor Subbarao Kambhampati with the School of AI at ASU.
Voice-cloning technology is becoming widely available through websites and apps offering services for a subscription price, some even for free.
"People can take just a few seconds of your voice and then start to have the system speak in your voice," Professor Kambhampati said.
Where are scammers and crooks getting your voice samples? Well, your audio may already be available.
"You've written stuff, you've tweeted stuff, you've said things that are in family videos - everything is online," said Professor Kambhampati, adding the better the technology the more quickly and with less audio samples a voice can be cloned.
Scammers are using AI in their schemes so often that the Federal Trade Commission put out a consumer alert.
"All [the scammer] needs is a short audio clip of your family member's voice, which he could get from content posted online, and a voice-cloning program," the commission warned. "When the scammer calls you, he'll sound just like your loved one."
Professor Kambhampati believes scam-blocking technology will be developed, like third-party authentication, to weed out voice-cloned calls, but we're not there yet.
In the meantime, you can take steps to protect yourself against voice-cloning scams:
- Be cautious of unsolicited calls or messages. If you receive a suspicious call - hang up! Do not call the number back. Instead, contact your loved one directly.
- Use a code word. Set up code words or questions that only you and your loved ones know to help determine if a situation is real or fake.
- Be careful with your personal information. Limit the content and information you share online — especially videos — and limit how much of it you make public. Opting for private profiles can restrict what you post to family and friends only.
"We can be scammed on anything where we depended on how somebody sounds, how somebody looks, how somebody writes, because all of these can be generated by AI systems," Professor Kambhampati.
AI is also being used to make phishing scams more believable. AI programs can send more targeted, personal messages, and chatbots allow scammers to engage with many targets at once.