Fatal Tesla Autopilot crash due to ‘over-reliance on automation, lack of safeguards’

The United States' National Transportation Safety Board (NTSB) has released its final findings on the fatal crash involving a Tesla Model S operating in semi-autonomous Autopilot mode.

The crash occurred in Florida in May 2016, when Joshua Brown's Tesla Model S collided with the underside of a tractor-trailer as the truck turned across the non-controlled-access highway.

Tesla's Autopilot system is a Level 2 semi-autonomous driving mode, designed to automatically steer and accelerate the car while it's on a controlled-access motorway or freeway with well-defined entry and exit ramps.

According to the NTSB, Tesla's Autopilot functioned as designed: it was not programmed to recognise a truck crossing into the car's path from an intersecting road, and therefore did not warn the driver or engage the automated emergency braking system.

The report said the "driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations".

The NTSB's investigators concluded that "while evidence revealed the Tesla driver was not attentive to the driving task, investigators could not determine from available evidence the reason for his inattention".

It also noted the truck driver had used marijuana before the crash, but said his "level of impairment, if any, at the time of the crash could not be determined from the available evidence".

Tesla did not escape blame, with the NTSB calling out the electric carmaker for its ineffective methods of ensuring driver engagement.

In issuing the report, Robert L. Sumwalt III, the NTSB's chairman, said: "System safeguards that should have prevented the Tesla's driver from using the car's automation system on certain roadways were lacking, and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened".

Tesla has since made changes to its Autopilot system, including reducing the interval before it begins warning the driver that their hands are off the steering wheel.

As part of its findings, the NTSB also issued a number of recommendations to various government authorities and carmakers offering Level 2 semi-autonomous driving features.

The NTSB called for standardised data logging formats, safeguards to ensure autonomous driving systems are used only in the manner for which they were designed, and improved monitoring of driver engagement in vehicles fitted with autonomous and semi-autonomous safety systems.

Joshua Brown's family issued a statement through their lawyers earlier this week in anticipation of the NTSB's report.

"We heard numerous times that the car killed our son. That is simply not the case," the family said. "There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car.

"People die every day in car accidents. Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements."
