

What happens when a Waymo gets confused?  

Questions about safety and human intervention for autonomous rides

PHOENIX — As more cars with empty driver seats are hitting our streets, there are growing safety questions about how humans and robots are “sharing” the road.

As of the end of October, Waymo says it’s providing more than 150,000 paid trips weekly in its self-driving cars, covering one million miles. That is triple the number of weekly rides it gave in June 2024.

While Waymo has expanded to four cities, Phoenix has the largest operating area at 315 square miles.

As the company scales up, so do safety concerns and questions about how often humans intervene in “fully autonomous” rides.

Waymo vs. semi

Truck driver Shabani Kwizera uses a loading dock next to a central Phoenix Waymo hub. Last month, he posted a TikTok video about a crash between a semi-truck and a Waymo in the driveway to the complex.

“[The tractor trailers] stop right here to check on the traffic,” Kwizera said on the video, panning the camera to the center of the driveway. “These Waymos came from the back and to the right side. The driver didn't see it by the time he was turning - the Waymo was on the blind side.”

Kwizera said the truck smashed the Waymo as it turned right. The autonomous car had black scrape marks across its front, driver-side panel. Based on what he witnessed, Kwizera said a human driver would have accounted for the truck’s turning radius and blind spot.

“They're going to give us space so we can turn, but what happened with Waymo is they do not give us space,” he said. Phoenix police told ABC15 they investigated this crash.

Federal auto safety investigation

The National Highway Traffic Safety Administration is looking into 31 reports of crashes and alleged traffic violations involving Waymos. In letters to the self-driving car company last spring, federal safety regulators asked for extensive documentation. One NHTSA letter said the autonomous vehicles had “unexpected driving behaviors” that “may increase the risk of crash, property damage, and injury.” The letters also described “collisions with clearly visible objects that a competent driver would be expected to avoid.”

“The industry had been getting pretty much of a pass on all the annoyances and problems and loose ends,” said Phil Koopman, an associate professor at Carnegie Mellon University who’s studied self-driving car safety for 25 years.

He is closely watching the federal investigation and Waymo’s two voluntary recalls earlier this year after crashes in the Phoenix area.

“People could no longer say, well, nothing's going wrong. Leave us alone. You're getting in the way of progress,” said Koopman.

Waymo’s website says the company is “on a mission to be the world's most trusted driver.” The company has released multiple analyses of vehicle safety. Waymo’s latest data, covering its first 25 million miles of operation, showed the fully autonomous vehicles had 81% fewer airbag-deployment crashes, fewer injury-causing crashes, and 57% fewer police-reported crashes than human drivers covering the same distance in the cities where Waymo operates.

Koopman says that, statistically, we may not know whether Waymos are safer than human drivers until the self-driving cars hit one billion miles.

“We're going to start to see the rare events matter more and have more concern about the safety kind of events,” Koopman said. “What we've seen is they make robot mistakes.”

In social media videos now part of the federal investigation, Waymos appear to make unexpected maneuvers, including cutting off a bus that had the right of way, swerving side to side behind a landscaping truck, driving into a closed construction zone, and blocking a major intersection.

Professor Koopman explained that autonomous cars use machine learning, which is trained on examples.

“What if it sees something it doesn't know as an example?” Koopman said. “Not only does it not know what to do, it often has false confidence and just makes something up and just does something crazy.”

Waymo’s remote assistance

Joel Ricks Johnson, an Arizona YouTuber, said he has documented more than 170 Waymo rides. One video, which has more than 500,000 views, shows a Waymo “going rogue” when it encounters traffic cones.

“The car tried to make a right turn, and it just couldn't quite do it because there were cones in the lane,” Johnson said.


During the incident three years ago, Johnson said he called rider support from the back seat of the robo-car.

On its blog, Waymo describes its fleet response system as “phone-a-friend.” Employees use their computers to connect with the car and remotely check its on-board cameras and sensors. The car can prompt human remote assistance operators with multiple-choice questions that give them better context about the situation. The remote assistance team can also give the vehicle a trajectory to follow.
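To illustrate the concept, here is a minimal, hypothetical sketch in Python of how a “phone-a-friend” exchange like the one Waymo describes might be structured. Every name, threshold, and data value below is an illustrative assumption, not Waymo’s actual software.

```python
# Hypothetical sketch of a "phone-a-friend" remote-assistance flow, based only
# on the article's description. Names and structures are illustrative, not Waymo's.
from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class AssistanceRequest:
    """Multiple-choice question the car sends when it is unsure how to proceed."""
    situation: str      # summary built from on-board cameras and sensors
    options: list[str]  # choices presented to the remote operator


@dataclass
class OperatorResponse:
    """The remote operator's answer."""
    chosen_option: int | None = None
    trajectory: list[tuple[float, float]] = field(default_factory=list)  # optional (x, y) waypoints


def ask_remote_operator(request: AssistanceRequest) -> OperatorResponse:
    """Stand-in for the real network exchange with the fleet response team."""
    print(f"Situation: {request.situation}")
    for i, option in enumerate(request.options):
        print(f"  [{i}] {option}")
    # In this sketch, the operator always sends back a short trajectory to follow.
    return OperatorResponse(trajectory=[(0.0, 0.0), (2.0, 0.5), (4.0, 1.5)])


def drive_with_assistance(confidence: float, threshold: float = 0.8) -> None:
    """Ask a human for help instead of guessing when model confidence is low."""
    if confidence >= threshold:
        print("Proceeding autonomously.")
        return

    request = AssistanceRequest(
        situation="Traffic cones block the planned right-turn lane",
        options=["Wait for the lane to clear",
                 "Re-route around the block",
                 "Follow an operator-provided trajectory"],
    )
    response = ask_remote_operator(request)
    if response.trajectory:
        print(f"Following operator trajectory: {response.trajectory}")
    elif response.chosen_option is not None:
        print(f"Executing: {request.options[response.chosen_option]}")


if __name__ == "__main__":
    drive_with_assistance(confidence=0.35)  # low confidence triggers the phone-a-friend flow
```

The key design idea, as Waymo's blog describes it, is that the human never drives the car directly; the vehicle asks a question or receives a suggested path and remains responsible for executing it.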

Johnson said in his case, “It was kind of like humans fighting with robot, and then it actually just got stuck in high-speed traffic.”

Waymo then sent a human roadside assistance technician to the scene to take over driving, but as the technician approached, Johnson said the Waymo car started to drive away.

“Waymo tried to run,” Johnson said on the video. He estimates roadside assistance has been called on his Waymo trips four or five times over 2,400 miles of travel.

“I don't know if I'm exactly representative of the normal, typical rider,” Johnson said.

The ABC15 Investigators emailed Waymo asking for more details about its remote assistance operations and the frequency of human interventions.

The company declined to let us view its remote assistance operations.

A Waymo spokesperson would not say how many humans help the cars, explaining it’s not a strict ratio. Instead, he wrote, “Waymo sets expectations for service quality based on rapid response times, and we adjust and maintain the workforce necessary to meet those standards.” Waymo did not elaborate on those standards.

Johnson said Waymo has been “cagey about statistics,” and it’s not always clear when human intervention is occurring on a robo-taxi ride.

“You can kind of tell sometimes if the car gets into a weird situation, there are little pointers here and there, like if the hazard lights turn on, if the steering looks weird, if the planned path gets really short,” Johnson said.

Johnson said overall he feels very safe.

“My mission is to document autonomous vehicle progress over time,” the YouTuber added. He said he pays for all his own rides.



A Waymo representative told ABC15 the autonomous driver does improve over time through artificial intelligence and operational improvements, and “it can solve more ambiguous scenarios independently and needs less help.”

A Waymo spokesperson also told ABC15 the company is complying with the NHTSA investigation.

NHTSA officials declined an interview request because the agency generally does not comment on an open investigation, but a spokesperson did send this statement:

"The National Highway Traffic Safety Administration’s approach to advanced vehicle technologies prioritizes safety across multiple areas including data collection and analysis, research, rulemaking, and enforcement.
  
"NHTSA has leveraged its authority in unprecedented ways to help assure the safety of vehicles with advanced technologies, including launching a first-of-its-kind Standing General Order requiring crash reporting and initiating a demonstration program designed to enhance public safety and transparency of ADS deployments. The Standing General Order also requires reporting of certain crashes involving vehicles with advanced driver assistance systems.

"NHTSA will continue to hold manufacturers accountable for any products that introduce an unreasonable risk to safety. The agency has opened multiple investigations into several manufacturers regarding potential safety defects in ADS systems, which have led to recalls of several ADS systems."

Do you have a self-driving vehicle video or story to share? You can reach ABC15 Senior Investigator Melissa Blasius by email at melissa.blasius@abc15.com or call 602-803-2506. Follow her on X (formerly Twitter) @MelissaBlasius or Facebook.