Tesla crashes, causes chain accident with police car, ambulance
Police say the Tesla was possibly on “Autopilot” at the time of the crash and are investigating whether the driver was drunk
The Mercury News
BENSON, Ariz. — Police are investigating possible drunk driving by a Tesla driver in Arizona who said he was using the Bay Area electric car maker’s controversial “Autopilot” system when his sedan smashed into an unoccupied police vehicle, which then hit an ambulance.
The crash occurred Tuesday on an Arizona highway, according to the state’s Department of Public Safety. “We can confirm the driver indicated to troopers the Tesla was on autopilot at the time of the collision,” the department tweeted, adding that the 23-year-old male driver was being investigated for driving under the influence.
🚨 Reminder: Please #SlowDown & #MoveOver when you see flashing lights & vehicles stopped on the side of the road! Today, a Tesla rear-ended a patrol vehicle at the scene of an earlier crash on I-10 EB near Benson. Luckily, our sergeant wasn’t in the vehicle & wasn’t hurt. (1/2) pic.twitter.com/WZhUQ10StL
— Dept. of Public Safety (@Arizona_DPS), July 14, 2020
The police sergeant who had driven the department’s SUV was not in it at the time of the crash, and the ambulance occupants were not hurt, the department said. The Tesla driver was hospitalized with serious but not life-threatening injuries, police said.
Tesla did not immediately respond to a request for comment. After a fatal 2018 accident involving a Tesla on Autopilot in Mountain View, the Palo Alto company said that “Autopilot can be safely used on divided and undivided roads as long as the driver remains attentive and ready to take control,” the National Transportation Safety Board noted in a report. In a 2018 blog post, Tesla claimed Autopilot makes crashes “much less likely to occur,” arguing that “No one knows about the accidents that didn’t happen, only the ones that did.”
Crashes involving Tesla’s Autopilot driver-assistance system have sparked multiple investigations by the federal safety board. The agency found a Tesla driver’s over-reliance on the automated system was a factor in a 2016 fatal Model S crash in Florida, and determined that in 2018 in Mountain View, Autopilot steered a Tesla Model X SUV into a Highway 101 barrier, a collision that caused the driver’s death.
After another fatal Florida crash, between a Model 3 sedan and a truck in March 2019, the agency blamed the driver’s over-reliance on automation and Tesla’s design of the Autopilot system as well as “the company’s failure to limit the use of the system to the conditions for which it was designed,” it said in a report.
The report noted that after the 2016 Florida crash, which involved a collision between a Tesla and a truck, the agency recommended that Tesla and five other car makers using automated systems develop technology to “more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking.” However, while the five other companies responded with descriptions of their planned solutions, “Tesla was the only manufacturer that did not officially respond,” the report said.
The agency also found Autopilot was a factor when a Model S slammed into the back of a fire truck on I-405 in Culver City near Los Angeles in 2018. The driver was also to blame in the non-injury collision, for using Autopilot in “ways inconsistent with guidance and warnings from the manufacturer,” the agency reported.
In December, the National Highway Traffic Safety Administration announced its 12th investigation into a Tesla crash possibly tied to Autopilot. In that accident, a Model 3 rear-ended a parked police car in Connecticut.
©2020 the San Jose Mercury News (San Jose, Calif.)