Tesla Model X involved in Autopilot-related crash in US
Third known incident involving Tesla's autonomous driving function occurs in Montana; driver and passenger unharmed

A Tesla Model X using the manufacturer’s Autopilot system was involved in a crash in the US on Sunday.

The driver and passenger were unharmed, and Tesla suggested the system was not being used properly.

Data shows that Autosteer was engaged on an undivided mountain road in Montana, and that the driver’s hands were off the steering wheel when the electric SUV swerved and hit a post beside the road.

Tesla’s terms of use recommend keeping hands on the steering wheel at all times when Autosteer is engaged, and the manufacturer has advised against using the system at high speeds on undivided roads.

It is the third known crash involving Tesla’s autonomous driving function. One of the previous incidents resulted in a fatality, when a Model S collided with a truck’s trailer in its path.

The other incident saw a Model X hit a guard rail on a highway in Pennsylvania before flipping onto its roof. The driver and passenger were both injured, and police later charged the driver with careless driving. It is not yet clear whether, or how, Autopilot was involved.

There have so far been no reported incidents in the UK involving Autopilot, and Tesla boss Elon Musk has said the company will continue with its autonomous plans.

Other manufacturers' autonomous system development is gathering pace.

Nissan has launched its ProPilot system in a Japanese model, and the technology is set to be introduced on the Nissan Qashqai in the UK next year.

Jaguar Land Rover has also unveiled its autonomous ambitions, pledging to test 100 self-driving models on roads in England over the next four years.

Read more: why autonomous cars are inevitable

Join the debate

samjacobs 6 December 2016

What does it mean, 'not used'?

What does it mean, 'not used properly'?
spqr 13 July 2016

Oxymoron

The system is called "autonomous"; it should therefore require no driver (third-party) input to work. If it does, it is by definition not "autonomous", and Tesla should stop marketing it as such. As a practising lawyer, it seems to me that as soon as people like Mr Musk and the boards of Audi, BMW, Mercedes-Benz etc realise that if (when; it has already happened) their systems kill someone they could be liable to prosecution for corporate manslaughter, they will forget about this idiocy until the technology is "mature". Probably around 2100, or at least after they are dead and gone.
Scratch 13 July 2016

Under-used GPS and mapping data

Perhaps too simple a question to ask: if the vehicle knows where it is, why is it not "clever" enough to know which roads are suitable and which are not? Multi-lane freeways/motorways should be easy. Tick. Anything else, maybe not.
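The geofencing idea this commenter raises can be sketched simply: a car that knows its GPS position and has road-class map data could refuse to engage an assist feature on unsuitable roads. The sketch below is purely illustrative; the RoadSegment type, the road-class labels and the autosteer_permitted check are assumptions made for the example, not any manufacturer's actual implementation.

```python
# Hypothetical sketch of geofencing a driver-assist feature by road type.
# All names and road classes here are illustrative assumptions; no real
# Tesla or map-provider API is shown.

from dataclasses import dataclass

@dataclass
class RoadSegment:
    road_class: str      # e.g. "motorway", "trunk", "mountain_road"
    is_divided: bool     # physical separation between directions of travel
    speed_limit_mph: int

def autosteer_permitted(segment: RoadSegment) -> bool:
    """Allow the assist feature only on the roads it is designed for:
    divided, motorway-grade carriageways."""
    if segment.road_class != "motorway":
        return False
    return segment.is_divided

# An undivided mountain road, like the one in the Montana incident,
# fails the check; a divided motorway passes it.
mountain_road = RoadSegment("mountain_road", is_divided=False, speed_limit_mph=55)
motorway = RoadSegment("motorway", is_divided=True, speed_limit_mph=70)

print(autosteer_permitted(mountain_road))  # False
print(autosteer_permitted(motorway))       # True
```

In practice the hard part would be map quality and coverage: an undivided road mislabelled in the map data as a divided highway would defeat the check, which may be one reason manufacturers rely on driver supervision rather than a hard geographic lockout.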