A National Transportation Safety Board (NTSB) investigation into a fatal Tesla crash has concluded that the driver of the Model S, the driver of a tractor-trailer and the design of the Model S itself all share blame in the death of former Navy SEAL Joshua Brown.
The NTSB report blames the deadly crash on "driver errors," an "overreliance on automation" and a "lack of safeguards" designed into Tesla's "Autopilot" system.
The May 2016 Florida crash occurred when Joshua D. Brown, 40, of Canton, Ohio, driving his 2015 Tesla Model S with Autopilot engaged, crashed into the side of a tractor-trailer that was crossing the highway to make a turn. Mr. Brown was known as a huge supporter of Tesla and autonomous technology and had founded a company that worked on internet and camera systems.
Investigators say the Model S went under the tractor-trailer and exited the other side of the trailer before going off the road and through two fences, finally hitting a utility pole. Brown was pronounced dead at the scene. The truck driver, Frank Baressi, 62, was not injured.
Tesla originally responded to the crash by saying the Autopilot feature is meant only to "assist" a driver, not to take full control from the driver. The automaker had taken heat for marketing materials in which Autopilot sounded more like "self-driving" technology that would take the driving responsibility away from drivers.
After a crash in China that involved the Autopilot system, the automaker removed from its website a Chinese phrase that means "self-driving."
After the Brown crash, Tesla said a driver must keep their hands on the steering wheel at all times, something that seems contradictory for a system called "Autopilot." Tesla also said the car provides audible and visual warnings if the system catches the driver without their hands on the wheel.
Additionally, Tesla said the system makes frequent checks to verify a driver's hands are on the wheel, and the system will gradually slow down the car until it detects the driver touching the steering wheel.
Tesla also pointed out the Autopilot feature is disabled by default and a driver must acknowledge the technology is new and still being tested before the driver can enable the feature.
The Florida crash caused the National Highway Traffic Safety Administration (NHTSA) and the NTSB to open investigations, something the NTSB typically doesn't do for vehicle crashes. NHTSA closed its investigation after finding no defects with Autopilot systems, and the NTSB in its final report makes seven recommendations to improve vehicle safety.
Safety investigators say the Brown crash was caused by the truck driver's failure to yield while crossing the highway and by Mr. Brown's inattention to the road and his Model S, the result of placing too much faith in the technology.
In addition, the NTSB places blame on Tesla for designing Autopilot to allow a driver to ignore the warnings and have a false sense of security in the technology.
“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles. System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened.” - NTSB
The government found that Autopilot was not designed to identify, and could not identify, the truck or trailer crossing the highway, and the system didn't "see" the crash coming.
In this respect the car completely failed, continuing at 74 mph as if no obstacle were in the way. The car didn't slow down, the forward collision warning system didn't alert Brown to the impending impact, and the automatic braking feature didn't activate.
The NTSB says Mr. Brown had a history of using the Autopilot system in ways the technology was not meant to be used and should have known about the limitations of both Autopilot and auto braking.
The NTSB also faulted Tesla for the design of the autonomous systems, specifically how those systems monitored and responded to the driver's interaction with the steering wheel. The feds say no effective method existed to ensure the driver stayed engaged with the steering wheel and the car.
Investigators determined mechanical system failures, fatigue and the design of the highway were not factors in the Model S crash. As for the truck driver, he wasn't using a cell phone, but he had used marijuana before the crash. Although he was not attentive to the road or his driving, investigators couldn't determine from the available evidence what caused his inattention.
The NTSB issued its safety recommendations based upon its findings:
- "Event data should be captured and available in standard formats on new vehicles equipped with automated vehicle control systems."
- Automakers "should incorporate system safeguards to limit the use of automated control systems to conditions for which they are designed and for there to be a method to verify those safeguards."
- Autonomous companies should develop "applications to more effectively sense a driver’s level of engagement and alert when engagement is lacking."
- All automakers should "report incidents, crashes, and exposure numbers involving vehicles equipped with automated vehicle control systems."
As for the Autopilot technology, Tesla updated the system after the Brown crash to shorten the time a driver's hands can be off the steering wheel before the system issues a warning or alert.