The data shows that Tesla accounts for 70% of reported crashes involving driver-assistance systems.

How safe are automated driving systems? Are some safer than others?

Seven years after Tesla began selling cars equipped with what it calls Autopilot, auto safety regulators still can’t answer these basic and vital questions.

But they took a first step toward answering them on Wednesday, when the National Highway Traffic Safety Administration released its first batch of crash reports involving advanced driver-assistance systems.

The numbers are suggestive: Tesla accounts for 70% of all reported crashes involving “Level 2” driver-assistance systems, which combine adaptive cruise control with automatic lane keeping and may include more advanced features such as automatic lane changes. That figure is sure to provoke critics who say Elon Musk’s company has taken a reckless approach to rolling out untested technology.

But much more detail and context is needed before regulators can definitively say whether such systems can outperform human drivers or each other.

“These data may raise more questions than they answer,” NHTSA chief Steven Cliff told reporters.

In June 2021, the agency ordered automakers to report serious crashes involving Level 2 systems. The numbers released Wednesday reflect crashes that occurred from then through May 15 of this year.

Automakers reported 392 crashes involving advanced driver-assistance systems over that period.

Of those, 273 were reported by Tesla, 90 by Honda and 10 by Subaru; other automakers reported serious crashes in the single digits.

“These data provide limited insight into hundreds of accidents,” said Bryant Walker Smith, a professor of automated vehicle law at the University of South Carolina Law School. “But there were literally millions of other accidents in that same period.”

But no one should conclude that Level 2 systems are safer than human-driven cars, he said. They may be, or they may not. According to him, the NHTSA data is too limited to support such conclusions.

The data does not include the number of equipped vehicles each company has on the road or the total miles driven with Level 2 systems engaged. NHTSA did not comment on how thorough each company’s reporting practices are. The agency plans to release the data monthly.

Crashes prevented by automated systems “obviously don’t register, to the extent that they didn’t happen,” Smith said. A deeper understanding of the causes of reported crashes, including the roles played by the system, the driver, the driver-monitoring system and conditions on the road, would help safety regulators reach firm conclusions, he said.

“What the NHTSA provided was a fruit bowl of data with a lot of caveats, making it hard for both the public and experts to understand what is being reported,” said Jennifer Homendy, chair of the National Transportation Safety Board. “Independent analysis of the data is key to identifying any safety gaps and possible ways to address them.”

Last year’s crash-reporting requirement marked NHTSA’s first attempt to address a profound lack of knowledge about the real-world safety of automated vehicles on public roads.

Any manufacturer’s automated system could be safer than human drivers. Or less safe. There is not enough data to draw sound conclusions either way. The country’s crash data collection systems, decades old, inconsistent and still paper-based in many police departments, are ill-equipped to determine what role automated systems play in preventing or causing accidents.

“One would hope that NHTSA would ‘do the job’ of making the numbers it releases in these bulletins truly comparable,” Alain Kornhauser, head of the self-driving car program at Princeton University, said in an email.

In addition to collecting crash data, NHTSA is investigating why Tesla cars have crashed into emergency vehicles parked on the roadside, often with lights flashing.

The investigation was triggered by 11 crashes that resulted in 17 injuries and one death, including three crashes in Southern California; the count of such crashes has since grown to 16. About 830,000 vehicles, all Teslas sold in the US from 2014 to 2022, are covered by the investigation.

As part of that investigation, regulators will examine the operation of Tesla’s automatic emergency braking systems. As The Times reported last year, Tesla drivers report emergency braking problems far more often than drivers of other brands.

The emergency-vehicle investigation grew more serious earlier this month, when NHTSA upgraded it to “EA,” or engineering analysis. That designation means investigators will take a close look at the technical design and performance of Autopilot. Once an investigation reaches the EA stage, a recall becomes more likely.

Meanwhile, the California Department of Motor Vehicles continues to investigate whether Tesla is deceptively marketing its $12,000 Full Self-Driving feature. The vast majority of experts in the field note that the system comes nowhere close to being able to drive itself safely.

But that review is now more than a year old, and the DMV has not said when it might be completed.

State lawmakers are increasingly concerned about the DMV’s seemingly lax approach to Tesla. In December, California Senate Transportation Committee Chair Lena Gonzalez asked the DMV to provide crash and safety information to the committee. The DMV said it would look into it, and it is still looking into it.

The DMV, it appears, is allowing Tesla to test self-driving cars on public roads without requiring the company to report crashes or system failures, as it requires of competitors such as Waymo, Cruise, Argo and Zoox. DMV head Steve Gordon has declined all media requests to discuss the subject since May 2021.