Green Fleet Top News

NTSB Faults Trucker and Motorist in Fatal Tesla Crash

September 12, 2017, by David Cullen

Joshua Brown's Tesla sedan, after the crash with a truck that took his life. Photo: NTSB/Florida Highway Patrol

The National Transportation Safety Board has ruled that a truck driver’s failure to yield the right of way and a car driver’s “inattention due to over-reliance on vehicle automation” are the probable cause of the May 7, 2016, collision of a tractor-trailer and a Tesla Model S 70D sedan operating in autonomous mode.

The first fatal crash of an autonomous car in the U.S., the accident claimed the life of the Tesla’s driver, 40-year-old Joshua Brown, of Canton, Ohio.

In a statement on the report, which was issued on Sept. 12, NTSB said it also found that the operational design of the Tesla’s vehicle automation “permitted the car driver’s overreliance on the automation,” noting its design “allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.”

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB Chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong.”

He added that “safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened.”

Per NTSB, the report’s findings include:

  • The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.
  • The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.
  • If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.
  • The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.
  • Tesla made design changes to its “Autopilot” system following the crash. The change reduced the period of time before the “Autopilot” system issues a warning/alert when the driver’s hands are off the steering wheel. The change also added a preferred road constraint to the alert timing sequence.
  • Fatigue, highway design, and mechanical system failures were not factors in the crash. There was no evidence indicating the truck driver was distracted by cell phone use. While evidence revealed the Tesla driver was not attentive to the driving task, investigators could not determine from available evidence the reason for his inattention.
  • Although the results of post-crash drug testing established that the truck driver had used marijuana before the crash, his level of impairment, if any, at the time of the crash could not be determined from the available evidence.

As a result of its investigation, NTSB has issued seven new safety recommendations. One recommendation has been issued to the Department of Transportation, three to the National Highway Traffic Safety Administration, two to the manufacturers of vehicles equipped with Level 2 vehicle automation systems, and one each to the Alliance of Automobile Manufacturers and Global Automakers.

NTSB said the safety recommendations for autonomous vehicles address the need for:

  • Event data to be captured and available in standard formats on new vehicles equipped with automated vehicle control systems.
  • Manufacturers to incorporate system safeguards to limit the use of automated control systems to conditions for which they are designed, and a method to verify those safeguards.
  • Development of applications to more effectively sense a driver’s level of engagement and alert the driver when engagement is lacking.
  • Manufacturers to report incidents, crashes, and exposure numbers involving vehicles equipped with automated vehicle control systems.

The board also reiterated two safety recommendations that it had issued to the National Highway Traffic Safety Administration in 2013. These deal with minimum performance standards for connected vehicle technology for all highway vehicles, as well as the need to require installation of the technology, once developed, on all newly manufactured highway vehicles.

The abstract of NTSB’s final report, which includes the findings, probable cause, and safety recommendations, is available online. The final report itself will be publicly released in the next several days. The webcast of the board meeting for this investigation will be available for 90 days.

Related: What Does the Tesla Accident Mean for Autonomous Vehicle Development?


  1. John McNeilly [ September 13, 2017 @ 05:11AM ]

    The real story appears to be that a truck driver turned in front of an oncoming vehicle, failed to yield to it, and that the driver of the truck making the decision to turn was under the influence of an illegal drug at the time, sufficient to be measured in a post-accident drug test. The car under autonomous control was contributory in severity, but not the cause - that's on the truck driver. These kinds of failure-to-yield accidents happen too often already, but this one seems to have drawn the attention of the NTSB because of the software in use in the car at the time. We fleets need to be diligent in outing substance abusers and in training our drivers better judgement in deciding when to slice through another vehicle's path of travel.

  2. Steve [ September 13, 2017 @ 05:58AM ]

    Actually, the truck driver was going thru a green light and the Tesla didn't see the red light and was doing 90 mph. This story was written by Tesla. The police report with witnesses all said the truck driver was going thru a green light.

  3. Scott [ September 13, 2017 @ 06:42AM ]

    If the truck was going through a green light, why does the first line of this article state that the NTSB ruled the truck driver failed to yield? Other headlines available online directly attribute the crash to Tesla; so either this is an unrepresentative article or theirs are. Is there public access to the police report? Have witnesses ever been wrong?

  4. Don [ September 13, 2017 @ 06:58AM ]

    Here is a link to the report:

    Google Maps shows no light at that intersection.

  5. Tom [ September 13, 2017 @ 12:21PM ]

    I read the NTSB abstract on this accident and was shocked that they make no mention of the speed at which the Tesla was traveling. The trucker making the left turn does have a responsibility to yield to oncoming traffic, but he can't be expected to know if that traffic is traveling far in excess of the speed limit. Was the driver of the Tesla going 90 MPH? How can this not be in the NTSB report? The report does say the Tesla hit the truck, tearing off its roof, continued on to hit a culvert and two fences, and sheared off a utility pole before coming to a stop. Clearly the primary fault of this accident is with the Tesla driver.

    Secondly, why does Tesla get a "pass" from NTSB? The system is billed as "traffic aware cruise control". How can it be incapable of being "aware" of a 13.6' by 53' trailer crossing in front of it? This is a traffic event that happens thousands of times a day in the U.S. Tesla has an overhyped and poorly designed system.

  6. Kelvin [ September 13, 2017 @ 12:31PM ]

    Tom, there's no need to be hysterical in your obvious disdain for Tesla hype, to the point that you fail to understand facts. Note that the autopilot system for that generation of Tesla was made by MobilEye, and after that incident Tesla quickly moved to end the relationship and develop its own in-house system. Note also that your "overhyped" MobilEye was just bought out by Intel for $15.3 billion. I guess you just know better than everyone else, huh? Let us know when you advance vehicle automation to the next level for the rest of us, ok??

  7. Tom [ September 13, 2017 @ 02:46PM ]

    I am not being hysterical. There are currently many cars and heavy trucks operating on our highways with true collision avoidance systems. They will detect closing rates that would result in a collision within 2.5 seconds, reduce throttle, and apply brakes. They work. They save lives. But they are not promoted as autopilots.

    Tesla is doing great things, but I think they got this one wrong.

