US launches investigation into Tesla Autopilot system
Transport safety body begins probe after identifying 11 crashes involving Teslas close to emergency incidents

The US government has launched a formal investigation into Tesla's Autopilot advanced driver assistance system after a series of crashes involving Tesla cars and parked emergency vehicles.

The investigation by the US National Highway Traffic Safety Administration (NHTSA) follows 11 crashes, in which a total of 17 people were injured and one killed, and potentially affects 765,000 cars.

The notice issued by the NHTSA covers virtually every Tesla sold in the US since 2014, including the Model S, Model X, Model 3 and Model Y.

The NHTSA says that its Office of Defects Investigation (ODI) has identified 11 crashes involving Tesla cars that occurred when they encountered "first responder scenes" being attended to by emergency services vehicles.

The body says that most of the incidents took place after dark and that the crash scenes included control measures such as emergency vehicle lights, illuminated road signage boards and traffic cones.

According to the NHTSA, every Tesla involved in the crashes had either its Autopilot or Traffic Aware Cruise Control advanced driver assistance system (ADAS) enabled on its approach to the accident scene. The cars involved subsequently struck one or more vehicles attending the first responder scenes.

Autopilot is a level-two ADAS system, meaning it can control both the vehicle’s steering and speed, although the NHTSA notes in its statement that the driver retains “primary responsibility for Object and Event Detection and Response (OEDR)”.

The NHTSA said that its investigation will “assess the technologies and methods used to monitor, assist and enforce the driver’s engagement with the dynamic driving task during Autopilot mode”. It will also look into any contributing circumstances for the crashes.

Autopilot has previously been investigated by the US National Transportation Safety Board (NTSB), which has recommended that the NHTSA require Tesla to introduce a better system to ensure drivers are paying attention when Autopilot is engaged.

In a report into a 2018 crash that was published last year, the NTSB determined that Tesla hadn’t done enough to prevent misuse of the system, but also that the NHTSA’s hands-off approach to regulating ADAS and related technology overlooked the risks of such systems.


James Attwood, digital editor
Title: Acting magazine editor

James is Autocar's acting magazine editor. Having served in that role since June 2023, he is in charge of the day-to-day running of the world's oldest car magazine, and regularly interviews some of the biggest names in the industry to secure news and features, such as his world exclusive look into production of Volkswagen currywurst. Really.

Before first joining Autocar in 2017, James spent more than a decade in motorsport journalism, working on Autosport, autosport.com, F1 Racing and Motorsport News, covering everything from club rallying to top-level international events. He also spent 18 months running Move Electric, Haymarket's e-mobility title, where he developed knowledge of the e-bike and e-scooter markets.

Comments
Torque Stear 17 August 2021

Currently, while the system is perfectly able to detect a stationary vehicle partially on the road, it isn't allowed to steer the car into a lane of oncoming traffic to go around it (even though it is perfectly capable of sensing the oncoming cars). The FSD Beta can deal with this.

Suspect that this will just be dealt with using a bong, a disconnection and autobrake, all done with an over-the-air update.

Of course, this news has knocked $30 billion off Tesla's value despite being a $1 million fix.

Peter Cavellini 16 August 2021

It's basically a machine; the key thing that's missing is thinking like a human, and that, at the moment, is why an autonomous system fails.

TS7 16 August 2021

The problem is that a computer system is (currently) unable to anticipate, as would an experienced driver. It merely reacts, as would an inexperienced driver.