Musk endorsed a boast that Tesla’s ‘Full Self-Driving’ beta has had no crashes, but data show otherwise

Elon Musk has long been using his mighty Twitter bullhorn to reinforce the idea that Tesla’s automated driving software isn’t just safe — it’s safer than anything a human driver can achieve.

The campaign kicked off last fall when the electric-car maker expanded its Full Self-Driving “beta” program from a few thousand people to a fleet that now numbers more than 100,000. The $12,000 feature purportedly allows a Tesla to drive itself on highways and local streets, changing lanes, making turns and obeying traffic signs and signals.

While critics have scolded Musk for testing experimental technology on public roads without trained safety drivers as a backstop, Santa Monica investment manager and ardent Tesla supporter Ross Gerber was among the allies who came to his defense.

“There have been no accidents or injuries since the launch of the FSD beta,” he tweeted in January. “Not one. Not one.”

To which Musk replied in one word: “Right.”

In fact, by then, dozens of drivers had already filed safety complaints with the National Highway Traffic Safety Administration over FSD-related incidents, and at least eight of them involved crashes. The complaints are available in a public database on the NHTSA website.

One driver reported that FSD “jerked right toward the semi-trailer,” then steered toward a median post, causing a crash.

“The car went into the wrong lane” with FSD engaged, “and I was hit by another driver in the lane next to my car,” another reported.

YouTube and Twitter are rife with videos exposing FSD misbehavior, including a recent post that appears to show a Tesla steering into the path of an oncoming train. The driver yanks the steering wheel to avoid a head-on collision.

It is almost impossible for anyone other than Tesla to say how many FSD-related crashes, injuries or deaths have occurred; NHTSA is investigating several recent fatal crashes in which FSD may have been involved. The agency recently ordered automakers to report serious crashes involving automated and semi-automated technology, but it has yet to release details of individual crashes.

Robotaxi companies such as Cruise, Waymo, Argo and Zoox are equipped with over-the-air software that immediately alerts the company to a crash. Tesla pioneered such software in passenger cars, but the company, which does not have a media relations department, did not respond to questions about whether it receives automated crash reports from FSD vehicles. Automakers without such software must rely on public reports and on communication with drivers and service centers to determine whether an NHTSA report is required.

Attempts to reach Musk also failed.

Gerber said he was unaware of the crash reports in the NHTSA database when he posted his tweet, but believed the company was aware of any crashes. “Because Tesla records everything that happens, Tesla is aware of every incident,” he said. He said drivers might have been at fault, but that he had not reviewed the reports himself.

Accurate publicly available statistics on automated-driving crashes do not currently exist, because the police officers who write up crash reports have only drivers’ statements to go on. “We are not experts on how to collect that kind of data,” said Amber Davis, a spokeswoman for the California Highway Patrol. “At the end of the day, we are asking for a person’s best recollection of how [a crash] happened.”

Exactly what data Tesla’s automated driving system collects and transmits back to headquarters is known only to Tesla, said Mahmood Hikmet, head of research and development at autonomous shuttle company Ohmio. He said Musk’s definition of a crash or an accident may differ from that of an insurance company or an average driver. NHTSA requires crash reports for fully or partially automated vehicles only if someone is injured, an airbag is deployed or the vehicle must be towed.

Reports of the FSD crashes were first spotted by FSD critic Taylor Ogan, manager of China-focused hedge fund Snow Bull Capital. The Times separately downloaded and evaluated the data to verify Ogan’s findings.

The data, which cover the period from Jan. 1, 2021, to Jan. 16, 2022, show dozens of FSD safety complaints, including many reports of phantom braking, in which the car’s automatic emergency braking system slams on the brakes for no apparent reason.

The following are excerpts from the eight crash reports in which FSD was involved:

  • Southampton, New York: A Model 3 traveling at 60 mph crashed into an SUV parked on the shoulder of the highway. The Tesla drove “right over the side of the SUV, ripping off the car’s mirror.” The driver called Tesla to say “our car has gone crazy.”
  • Houston: A Model 3 was traveling at 35 mph “when suddenly the car jumped over the curb, damaging the bumper, damaging the wheel and blowing a tire.” The crash “was apparently caused by a discolored patch on the road, which gave FSD the false impression of an obstacle it tried to avoid.” Denying the warranty claim, the Tesla service center charged $2,332.37 and said it would not return the car until the bill was paid.
  • Placentia, California: “While turning left, the car entered the wrong lane and I was hit by another driver in the lane next to my car.” The car “took control on its own and swerved into the wrong lane… putting everyone involved at risk. The car was badly damaged on the driver’s side.”
  • Collettsville, North Carolina: “The road curved to the left, and as the car made the turn it took too wide an arc and ran off the road… The right side of the car lifted up and over the start of a rocky incline. The front right tire blew out, and only the side airbags deployed (both sides). The car traveled about 500 meters down the road and then stalled.” The estimated damage was between $28,000 and $30,000.
  • Troy, Missouri: A Tesla was rounding a curve when “suddenly, about 40% of the way through the turn, the Model Y straightened its wheel and crossed the center line into the direct path of an oncoming vehicle. When I tried to return the car to my lane, I lost control and drove into a ditch and through the woods, causing significant damage to the car.”
  • Jackson, Missouri: The Model 3 “jerked to the right towards the semi-trailer, then tore left to the posts in the middle lane as it accelerated without the FSD disengaging. We had owned this car for 11 days when our accident happened.”
  • Hercules, California: “Phantom braking” caused the Tesla to stop suddenly, and “the car behind me didn’t react.” The rear-end collision resulted in “serious damage to the vehicle.”
  • Dallas: “I was driving with full self-driving assistance… a car was in my blind spot, so I tried to take control of the car by pulling on the steering wheel. The car sounded an alarm indicating that I was going to crash into the left median. I believe I was fighting with the car to regain control and ended up hitting the left median, which ricochet[ed] the car all the way to the right, hitting the median.”

Critics say the name “Full Self-Driving” is a misnomer, and that no car available for sale to an individual in the U.S. can drive itself. FSD is “a complete fantasy,” said NYU professor Meredith Broussard, author of the MIT Press book “Artificial Unintelligence.” “And it’s a safety nightmare.”
California regulations prohibit a company from advertising a car as fully self-driving when it is not. The state Department of Motor Vehicles is conducting a review of Tesla’s marketing, a review now in its second year.

DMV director Steve Gordon has declined to speak publicly on the matter since May 2021. On Wednesday, the department said: “The review is ongoing. Will let you know when we have something to share.”