Tesla sedan in Autopilot mode collides with parked police vehicle in Laguna Beach

A Tesla sedan operating in Autopilot mode collided with a parked Laguna Beach Police Department vehicle today. No one was in the police car, and the Tesla driver sustained only minor injuries, the Los Angeles Times reports, but the police car was “totaled,” Laguna Beach police sergeant Jim Cota told the newspaper.

Cota also said that a year ago, there had been another incident involving a Tesla colliding with a semi-truck in the same area. “Why do these vehicles keep doing that?” he told the LA Times. “We’re just lucky that people aren’t getting injured.”

In an emailed statement, a Tesla spokesperson said:

“When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and before a driver can use Autopilot, they must accept a dialogue box which states that ‘Autopilot is designed for use on highways that have a center divider and clear lane markings.’”

This is the latest in a series of accidents involving Tesla vehicles in Autopilot mode. These include a crash in Utah earlier this month that occurred while the driver was looking at her phone, as well as two fatal crashes: one in California two months ago and another in Florida in 2016.

Launched in late 2015, Tesla’s Autopilot feature is meant to “relieve drivers of the most tedious and potentially dangerous aspects of road travel” and includes standard safety features like automatic emergency braking and collision warnings, but it is not meant to replace an attentive human driver.

Despite Tesla’s instructions to drivers before they start using Autopilot, the automaker’s critics have called on the company to disable the feature until it can be made safer. For example, Consumer Reports said the name Autopilot gives drivers a “false sense of security” and that “these two messages—your vehicle can drive itself, but you may need to take over the controls at a moment’s notice—create potential for driver confusion.”

Tesla says it willingly withdrew from NTSB investigation

Tesla willingly withdrew from its party agreement with the National Transportation Safety Board, a company spokesperson told TechCrunch via email, adding that the agency is more concerned with “press headlines than actually promoting safety.”

“Last week, in a conversation with the NTSB, we were told that if we made additional statements before their 12-24 month investigative process is complete, we would no longer be a party to the investigation agreement,” a Tesla spokesperson said in a statement to TechCrunch. “On Tuesday, we chose to withdraw from the agreement and issued a statement to correct misleading claims that had been made about Autopilot — claims which made it seem as though Autopilot creates safety problems when the opposite is true.”

This comes after the NTSB said it revoked Tesla’s party status in the investigation regarding the fatal crash involving one of Tesla’s Model X cars. The NTSB said it did so because Tesla, without permission from the NTSB, relayed information to the public regarding the investigation.

Tesla went on to compare the overall rate of automotive fatalities in the United States with the rate for cars equipped with Autopilot. According to Tesla, there is one fatality, including known pedestrian fatalities, for every 320 million miles driven by cars equipped with Autopilot hardware, compared with one fatality for every 86 million miles driven across all vehicles.

“If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident and this continues to improve,” the spokesperson said.
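Tesla’s 3.7x figure appears to follow directly from the ratio of the two per-fatality mileage numbers it cites; a quick sanity check of that arithmetic (the variable names below are ours, not Tesla’s):

    # Per-fatality mileage figures cited by Tesla, in millions of miles
    autopilot_miles_per_fatality = 320    # cars equipped with Autopilot hardware
    all_vehicles_miles_per_fatality = 86  # all vehicles on the road

    # How many times farther an Autopilot-equipped car travels, on average, per fatal accident
    ratio = autopilot_miles_per_fatality / all_vehicles_miles_per_fatality
    print(f"{ratio:.1f}x")  # prints "3.7x", matching the figure Tesla quotes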

Tesla also alleges it is “clear in our conversations” with the NTSB that the agency cares less about safety and more about press headlines.

“Among other things, they repeatedly released partial bits of incomplete information to the media in violation of their own rules, at the same time that they were trying to prevent us from telling all the facts,” the spokesperson said. “We don’t believe this is right and we will be making an official complaint to Congress. We will also be issuing a Freedom Of Information Act request to understand the reasoning behind their focus on the safest cars in America while they ignore the cars that are the least safe.  Perhaps there is a sound rationale for this, but we cannot imagine what that could possibly be.”

Tesla also noted that the NTSB is an advisory body rather than a regulatory one, and that the company has a “strong and positive relationship” with the National Highway Traffic Safety Administration (NHTSA).

“When tested by NHTSA, Model S and Model X each received five stars not only overall but in every sub-category,” the Tesla spokesperson said. “This was the only time an SUV had ever scored that well. Moreover, of all the cars that NHTSA has ever tested, Model S and Model X scored as the two cars with the lowest probability of injury. There is no company that cares more about safety and the evidence speaks for itself.”

When reached for comment on Tesla’s claims, the NTSB said it stands by the press release it issued earlier and has nothing to add. Meanwhile, NHTSA said its investigation is ongoing.

“The agency is in contact with local investigators, consistent with NHTSA’s vigilant oversight and authority over the safety of all motor vehicles and equipment,” an NHTSA spokesperson said in a statement to TechCrunch. “NHTSA is dispatching its Special Crash Investigation Team and will take action as appropriate.”

Tesla says fatal crash involved Autopilot

Tesla has provided another update on last week’s fatal crash. According to the company, the driver had Autopilot engaged with the adaptive cruise control follow distance set to minimum, and appears to have ignored the vehicle’s warnings to take back control.

“The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision,” Tesla wrote in a blog post. “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

The promise of Tesla’s Autopilot system is to reduce car accidents. In its blog post, Tesla notes that Autopilot reduces crash rates by 40 percent, citing an independent review by the U.S. government. That does not mean the technology prevents all accidents.

As Tesla previously noted, the crash was so severe because the crash attenuator on the highway’s concrete divider had been crushed in an earlier accident. Tesla also cautioned that Autopilot does not prevent all accidents, but it does make them less likely to occur. The blog post continues:

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
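The “about 900,000 lives saved per year” figure quoted above is consistent with applying the company’s claimed per-mile fatality rates to the 1.25 million annual road deaths it cites; a rough reconstruction of that arithmetic (ours, not a calculation Tesla has published):

    # Figures cited in Tesla's blog post
    worldwide_deaths_per_year = 1_250_000   # annual automotive fatalities worldwide
    autopilot_miles_per_fatality = 320e6    # miles per fatality, Autopilot-equipped cars
    all_vehicles_miles_per_fatality = 86e6  # miles per fatality, all vehicles

    # If every car matched the claimed Autopilot-equipped fatality rate,
    # deaths would scale down by the ratio of the two rates
    projected_deaths = worldwide_deaths_per_year * (all_vehicles_miles_per_fatality / autopilot_miles_per_fatality)
    lives_saved = worldwide_deaths_per_year - projected_deaths
    print(round(lives_saved))  # ~914,000, in line with the post's "about 900,000"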

This development comes in the wake of a fatal accident involving one of Uber’s self-driving cars in Tempe, Arizona.