Federal traffic safety investigators want Tesla to explain how and why it developed the fix in a recall of more than 2 million vehicles equipped with the company’s Autopilot semi-automated driving system.
Investigators at the U.S. National Highway Traffic Safety Administration have concerns about whether the recall remedy worked, because Tesla has reported 20 crashes since the remedy was sent out as an online software update in December.
The recall fix was also meant to address whether Autopilot should be used on roads other than limited-access highways. The remedy for that was increased warnings to the driver on roads with intersections.
But in a letter to Tesla posted on the agency’s website Tuesday, investigators wrote that they could find no difference between warnings to drivers before the recall and after the new software was released. The agency said it will evaluate whether driver warnings are adequate, particularly when a driver-monitoring camera is covered.
The agency requested extensive information about how Tesla developed the fix, focusing on how the company used research on human behavior to test the recall’s effectiveness.
“Inadequate remedial measures”
Phil Koopman, a professor at Carnegie Mellon University who studies automated driving safety, said the letter shows the recall did little to resolve problems with Autopilot and was an attempt to placate NHTSA, which sought the recall after more than two years of investigation.
“It’s pretty clear to anyone watching that Tesla has tried to provide as little remedial action as possible to see what they can get away with,” Koopman said. “And NHTSA must respond forcefully or other automakers will begin offering inadequate remedies.”
Safety advocates have long expressed concern that Autopilot, which can keep a vehicle in its lane and maintain a distance from objects in front of it, is not designed to operate on roads other than limited-access highways.
Missy Cummings, a professor of engineering and computer science at George Mason University who studies automated vehicles, said NHTSA is responding to criticism from lawmakers over a perceived lack of action on automated vehicles.
“As cumbersome as our government is, the feedback loop works,” Cummings said. “I think NHTSA leadership now believes this is a problem.”
The 18-page NHTSA letter asks how Tesla used the science of human behavior in developing Autopilot and how the company views the importance of evaluating human factors.
The letter also asks Tesla to identify every position involved in evaluating human behavior, along with the qualifications of the workers, and to say whether those positions still exist.
The Associated Press left a message early Tuesday seeking comment from Tesla about the letter.
Tesla recently laid off about 10% of its workforce, roughly 14,000 people, in a cost-cutting effort to cope with declining global sales.
Cummings said she suspects that CEO Elon Musk’s cuts may have included anyone with expertise in human behavior, a key skill needed for operating semi-automated systems like Autopilot, which cannot drive themselves and require humans to be ready to intervene at all times.
“If you want to have a technology based on human interaction, you better have someone on your team who knows what they are doing in this area,” she said.
Cummings said her research has shown that once a driving system takes over steering from humans, there is little left for the human brain to do. Many drivers tend to rely too heavily on the system and tune out.
“You can fix your head in one position, potentially keep your eyes on the road and be a million miles away in your mind,” she said. “All the driver monitoring technology in the world still won’t force you to pay attention.”
Is Autopilot on or off?
In its letter, NHTSA also asks Tesla for information about how the recall remedy addresses driver confusion over whether Autopilot has been turned off when force is applied to the steering wheel. Previously, if Autopilot was disabled, the driver might not quickly realize that they needed to take over driving.
The recall added a feature that provides “stronger deceleration” to alert drivers when Autopilot has been disabled. However, the recall does not activate the feature automatically; drivers must turn it on themselves. Investigators asked how many drivers have taken that step.
NHTSA is asking Tesla, “What do you mean you have a remedy and it doesn’t actually get turned on?” Koopman said.
The letter, he said, shows that NHTSA is checking whether Tesla conducted tests to ensure the fixes actually worked. “When I looked at it, I didn’t see much analysis showing that it improved safety,” Koopman said.
The agency also says Tesla made safety updates after the recall was issued, including an attempt to reduce crashes caused by hydroplaning and collisions in high-speed turn lanes. NHTSA said it will investigate why Tesla did not include those updates in the original recall.
Safety experts say NHTSA could seek additional recalls, limit Tesla’s use of Autopilot, or even force the company to disable the system until the problem is resolved.
NHTSA began its Autopilot investigation in 2021 after receiving 11 reports that Teslas using Autopilot had struck parked emergency vehicles. In documents explaining why the investigation was closed because of the recall, NHTSA said it ultimately found 467 accidents involving Autopilot, resulting in 54 injuries and 14 deaths.