Waymo self-driving cars are being investigated by the government for accidents, but recalls are the only enforcement power

The U.S. government’s National Highway Traffic Safety Administration has opened another investigation into automated driving systems, this time into accidents involving Waymo self-driving vehicles.

The National Highway Traffic Safety Administration published documents detailing the investigation on its website early Tuesday after receiving 22 reports of Waymo vehicles either crashing or doing something that may have violated traffic laws.

In the past month, the agency has opened at least four investigations into vehicles that either drive themselves or can perform at least some driving functions, as it appears to become increasingly aggressive in regulating the technology.

In its investigation of Waymo, the former Google self-driving vehicle division, the agency said it had reports of 17 accidents and five other reports of possible traffic law violations. No injuries were reported.

In the accidents, the Waymo vehicles collided with stationary objects such as gates, chains or parked vehicles. According to the documents, some of the incidents occurred shortly after the Waymo ride system began behaving unexpectedly near traffic control devices.

Waymo said NHTSA plays an important role in road safety and will continue to work with the agency “as part of our mission to become the most trusted driver in the world.”

The company said it conducts over 50,000 rides weekly in demanding environments. “We are proud of our performance and safety record across tens of millions of autonomous miles driven, as well as our proven commitment to safety transparency,” the statement said.

Waymo, based in Mountain View, California, operates robotaxis without human safety drivers in Arizona and California.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said NHTSA’s more aggressive actions show that autonomous vehicles may not yet be ready for public roads.

The agency’s only enforcement authority regarding autonomous vehicles currently is to initiate investigations and seek recalls, which it does, Brooks said. NHTSA has been criticized in the past for its slow regulation of Tesla and other companies that offer automated driving systems, but Brooks said things appear to have changed.

“Ultimately, I think it’s a good thing that they’re taking these steps and trying to figure out why these vehicles are behaving the way they are,” Brooks said.

NHTSA said it will examine the 22 incidents involving Waymo’s fifth-generation driving system, as well as similar scenarios, “to further assess any commonalities in these incidents.”

The agency said it believes Waymo’s automated driving system was activated in each incident or that, in some cases involving a test vehicle, a human driver deactivated the system shortly before the crash.

The probe will evaluate the system’s performance in detecting and responding to traffic control devices and in preventing accidents involving stationary and semi-stationary objects and vehicles, the documents said.

Since late April, NHTSA has opened investigations into collisions involving self-driving vehicles from Amazon-owned Zoox, as well as partially automated driver assistance systems from Tesla and Ford.

In 2021, the agency required all companies with self-driving vehicles or partially automated systems to report all accidents to the government. The investigations rely heavily on data reported by automakers under this order.

NHTSA is also investigating General Motors’ Cruise autonomous vehicle unit after reports that the vehicles may not have exercised appropriate caution around pedestrians. Cruise has recalled its cars to update the software after one of them dragged a pedestrian to the side of a road in San Francisco in early October.

The agency has also questioned whether a recall last year of Tesla’s Autopilot driver assistance system was effective enough to ensure human drivers were paying attention. NHTSA said it ultimately found 467 accidents involving Autopilot, resulting in 54 injuries and 14 deaths.

As part of the Ford investigation, the agency is investigating two overnight highway crashes that killed three people.

The agency also urged Tesla to recall its “Full Self Driving” system last year because it can misbehave at intersections and doesn’t always obey speed limits.

Despite their names, neither Tesla’s Autopilot nor its “Full Self Driving” system can drive vehicles on its own, and the company says human drivers must be ready to intervene at any time.

Additionally, NHTSA has moved to establish performance standards for automatic emergency braking systems, so they must brake quickly enough to avoid pedestrians and other vehicles.

The standards are based in part on investigations into automatic braking systems from Tesla, Honda, and Fisker, which can brake for no reason, increasing the risk of an accident.

In a 2022 interview, then-NHTSA Administrator Steven Cliff said the agency would increase inspections of automated vehicles, and the agency has recently taken additional measures. NHTSA has not had a Senate-confirmed administrator since Cliff moved to the California Air Resources Board in August 2022.
