AZ Attorney General releases warning on AI scams

PHOENIX — State Attorney General Kris Mayes has released a new warning to Arizonans about scams using artificial intelligence.

ABC15 has been reporting on AI scams for months, and as the schemes become more common, more victims are coming forward to share their experiences.

Payton Bock told ABC15 that her parents received a terrifying call in 2021.

“We just talked about it, and she still says it was one of the scariest days of her life,” said Bock.

Bock says the scammers told her parents there had been a car accident.

“They were being really aggressive to my mom and my dad,” said Bock.

The people on the phone demanded money.

“Then they said that they were going to kill me, and they had me in the back of the truck,” said Bock.

Her mom asked to speak with her daughter over the phone and was surprised to hear the voice on the other end.

“It was me and my voice,” said Bock.

She told ABC15 the voice sounded like she was begging her parents for help.

With the help of Phoenix police, her parents were able to verify that it was not Bock on the phone and that she was safe at work.

Bock said after seeing another Arizona family go through something similar, she made a TikTok warning others.

“I was like, I just need to let people know that this is real and it’s happening,” said Bock.

Just a few seconds of someone's voice is all scammers need to create a clone, and the technology can even mimic emotion.

Bock said it’s easier than people think to be fooled.

“When you are in a situation where you get a phone call that someone is holding your significant other hostage, and it sounds exactly like them, you’re not going to think normally or clearly,” said Bock.

ABC15's Let Joe Know team spoke with experts at ASU about how easily voice samples can be found online.

“You have written stuff, you have tweeted stuff, you have sent things, there are family videos, everything is online,” said Professor Subbarao Kambhampati.

Bock now shares her location with her parents so they always know where she is.

“Code words are huge,” said Bock.

That's something the AG’s office also recommended.

The Attorney General provides the following tips to protect yourself from voice-clone AI scams:

  • Beware of any emergency call asking for money to be sent right away.  
  • Don’t trust the voice or message, as voices can be imitated with AI.  
  • Hang up and call your loved one through a trusted number to verify the call or text. 
  • Beware of high-pressure scare tactics. 
  • Beware of requests for payments through gift cards, person-to-person pay apps, etc.  

Attorney General Kris Mayes shared a statement:
“Scammers are using AI technology to personalize scams and mimic a loved one’s voice—or to send similar personalized text messages—to trick people,” said Attorney General Mayes. “Receiving a call from a loved one in distress, with a voice that appears to be real, can easily push a consumer into rushing to send money or other forms of payment. Be wary of any call asking for emergency money. Contact the family member who is supposed to be calling to verify the ask – and always seek help from others, including law enforcement, before sending any form of payment.”