Here We Go AGAIN! Tesla Under Investigation AGAIN By NHTSA — This Time Someone Died

Unfortunately, a Tesla Model S driver using the Autopilot function was involved in a May 7 accident that took his life. The story may seem clear depending on whom you ask, but one thing remains muddy at this moment:

Did Tesla's Autopilot feature work as it should have?

Joshua Brown's Model S collided with a tractor trailer when the truck crossed the highway in front of him. According to Tesla, the Autopilot system was thrown off by the trailer's white side, the bright sun, and the trailer's elevated ride height.

Given that I've had the chance to play with sensors, this makes complete sense. What makes this even more interesting is that Brown claimed to have played around with the vehicle and its sensors to see how vulnerable they were. Hell, he even shared video online of a close call in which a utility truck got too close for comfort and the Model S swerved to avoid a run-in.

So, now what?

It's going to be a bit of a waiting game. We're waiting to see what NHTSA concludes.

That said, we're a bit curious: Should Tesla be held responsible from YOUR perspective, or should the electric vehicle maker walk because, ultimately, it's the driver's responsibility to control his/her vehicle?

Video of the Model S' save from April.



...The fatal crash has led to an investigation by the National Highway Traffic Safety Administration into Tesla’s Autopilot offerings. Tesla announced the inquiry Thursday afternoon, after learning about it Wednesday evening, according to a company blog post.

“It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations,” the blog post read.

According to Tesla’s recounting of the collision, a tractor trailer crossed a highway on which Brown was driving, and the truck’s white side combined with a bright sun kept the autopilot system as well as the driver from noticing the oncoming danger...



TheSteve - 7/1/2016 12:44:41 AM
-1 Boost
Let's wait until the details shake out to discover if Tesla's software has a serious and fatal flaw, or whether someone misunderstood (AKA wrongly assumed) what the product does and got themselves into deadly trouble.


Car4life1 - 7/1/2016 7:53:23 AM
+2 Boost
I guess this makes a stronger case for Mercedes' system, which is completely autonomous but requires the driver's hands on the steering wheel at certain intervals to make sure the driver is still awake and alert.


TomM - 7/1/2016 8:46:26 AM
+1 Boost
No matter whether they have their hands on the wheel or not - or whether they have Autopilot on or not - there are going to be accidents. People driving cars still do unpredictable things - all of us have experienced the guy in the third lane who plows across three lanes of traffic at the last second to reach an exit. And of course, we do not all drive at the same speed either.

I have already predicted that eventually speed limits will be hard-set by the road via controlling sensors in your car, preventing you from going faster. The government - once it knows such technology is available - will have a hard time NOT doing this. And that will take away some of the reason for driving yourself. The problem is, people will complain that the government doesn't belong there. And people will try to defeat this as well. And for a computer to ANTICIPATE what speed other cars are actually going - for every vehicle near you - will take a real programming effort.


dumpsty - 7/1/2016 10:57:05 AM
+1 Boost
TomM makes a good point. A lot of real-time data to consider.

Automakers are already looking at in-vehicle communications systems that allow an individual vehicle to share basic data with the surrounding vehicles. A proper system where vehicles "talk" to each other and operate in sync is the solution for zero accidents. Think "Minority Report". Such a system would also virtually eliminate stop-and-go rush hour flow and endless traffic jams.

I think GM has already developed a system that allows cars to form single-file lines on the highway and operate in sync.


Car4life1 - 7/1/2016 1:50:40 PM
+2 Boost
Accidents happen, but fatalities can be prevented or reduced by staying alert - and by a system that requires you to pay attention. Just like seatbelts don't prevent accidents but reduce the likelihood of a fatality, a system like Mercedes', which requires the driver's input every so often, could have ensured this accident didn't turn fatal.


TheSteve - 7/1/2016 11:22:16 PM
0 Boost
UPDATE: According to this story (http://hosted.ap.org/dynamic/stories/U/US_SELF_DRIVING_CAR_DEATH_ABRIDGED?SITE=AP) the Tesla "driver" was watching a Harry Potter DVD, likely because he (wrongly) assumed that Tesla's "Autopilot" feature meant "100% autonomous", even though Tesla clearly states that is not the case.

See my first post in this thread, where I recommend waiting for the facts to come out before we crucify or exonerate Tesla. I see from the voting that at least a couple of AutoSpies members thought that was a dumb idea, and that trial by social media was in order.


runninglogan1 - 7/1/2016 3:36:56 AM
-3 Boost
There have been over 120,000,000 miles driven so far using Tesla's Autopilot. This is the first death. Not to diminish the tragedy but these systems will never be perfect. Even modern airliners with triple redundancies encounter catastrophic failure. Many times due to pilot error. These systems will certainly get better as time goes by. As of now, an alert driver is absolutely required.


vdiv - 7/1/2016 7:43:37 AM
0 Boost
Gov't needs to worry about both because both kill people.


MDarringer - 7/1/2016 8:33:48 AM
+7 Boost
I see it as shared liability. If you're driving and your hands aren't on the wheel and you die as a result, you're responsible. HOWEVER, Tesla is also liable for putting out a technology that is not ready. If the sensors can be confused, then the vehicle is not safe. It cannot take over driving because it has the capacity to cause injury and death to the occupant.

Tesla owners think their cars are game-changing, but they aren't. Because Tesla does not have the engineering staff, experience, and money of, say, Mercedes, it simply cannot engineer a vehicle as well as a real automaker.

Now factor in that Musk runs the company like Hitler and speaks in hyperbole about his products, and it follows that he pushes things out before they're ready. The Model X is proof of that...poorly engineered...deeply flawed...a nightmare to own...and pushed to production anyway.


supermoto - 7/1/2016 10:10:00 AM
+7 Boost
My co-worker drove his Tesla on an 800-mile road trip from San Francisco to Los Angeles and back. I asked how the Autopilot was, and he said overall it was amazing, but it did try to kill him four or five times. It did things like suddenly taking a highway off-ramp - for no reason.


PUGPROUD - 7/1/2016 10:34:09 AM
+8 Boost
Couldn't tell the difference between the white side of a tractor trailer and the sky, so someone died...Oooops!


Copyright 2026 AutoSpies.com, LLC