Another Tesla Owner Recreates Autopilot Issue Where Fatal Model X Crash Occurred

Yesterday, we reported on a Tesla owner almost crashing on video trying to recreate the fatal Autopilot accident that happened in Mountain View last month.

Now, another Tesla owner recreated the situation at the exact same spot as the tragic accident – confirming previous potential explanations.

Last week, Tesla confirmed that the Model X involved in the fatal accident was on Autopilot before the crash and the Tesla community has been trying to figure out what exactly happened in order to better understand the driver assist system that they use every day.


pcar4evr - 4/3/2018 3:45:37 PM
+10 Boost
Knowing that you are on a road crowded with other drivers, and on a road that was not designed for autonomy, I think it's nuts to drive in autonomous mode. Where is the ease and convenience of "self-driving" on congested highways when you have to be on continuous alert for other drivers as well as for mistakes by your self-driving vehicle's software?


SanJoseDriver - 4/3/2018 4:52:45 PM
-7 Boost
What is surprising is that this only started happening after the last update (which Tesla owners have been praising as a huge improvement, especially for curvy roads). Before that, there were no issues in this section. It could also be that they still haven't replaced the barrier that is supposed to be there, and the lines on the road are fading away, as you can see in the video. This needs to be patched ASAP.


pcar4evr - 4/3/2018 6:08:55 PM
+5 Boost
Tesla vehicles should automatically monitor the number of nearby vehicles and, if it is too high, alert the driver that it is dangerous to be in autodrive. They could name this "Are you F'n nuts?" mode - goes well with "ludicrous".


mre30 - 4/3/2018 5:34:31 PM
+5 Boost
What I find strange is that there are multiple Tesla drivers (or should we call them operators or end-users?) who rushed out and took time out of their busy days to recreate the situation that killed one of their fellow Tesla owners. All the while shooting a video about it and posting it on YouTube.

When does NHTSA shut down autopilot? Oh, yeah, Trump never appointed anyone to head up NHTSA https://www.nhtsa.gov/about-nhtsa/nhtsa-leadership. Hopefully no pedestrians get killed.

Perhaps Tesla will learn from the Sh*tshow that happened to Uber and that poor woman in Arizona?


SanJoseDriver - 4/3/2018 7:08:45 PM
-8 Boost
Owners are passionate about the brand; they want to know what happened and contribute their experiences to other owners and/or Tesla so it can get fixed ASAP. Call it a beta if you want, but I would still rather use a beta product that is statistically safer than the alternative.


bw5011 - 4/3/2018 6:41:09 PM
+1 Boost
My SQ5 does that when the lines split and I am coming up on another lane; it will take the outside predominant line. I like the feature, but there is no way in hell I would trust the car to drive itself. The car is doing what it is supposed to do by continuing to go left in this video. Mine also turns off if the lines disappear because of old pavement, new paving, or a curb with no line. Of course, the SQ5 doesn't claim to be autonomous either.

They have to start somewhere, but this technology has a long way to go. People will get hurt. I wonder how many people were hurt during the first few aircraft or even the first few automobiles. We need to give it time and tell the "everybody gets a trophy" generation to wait.


SanJoseDriver - 4/3/2018 7:12:42 PM
-8 Boost
This is a problem that will eventually get solved. So far, only Waymo has cracked the code (albeit with far more expensive sensors): zero disengagements on highways last year in full autonomous mode, and only one disengagement every 50,000 miles in city driving. This year they aim to be the first to launch a driverless Level 4 service. If they can do it, other companies will eventually catch up.


MDarringer - 4/3/2018 7:28:40 PM
+6 Boost
DARWIN


supermoto - 4/4/2018 10:11:00 AM
+9 Boost
Tesla lies again.

First, the new version of Autopilot's codebase was largely rewritten, so past safety statistics are no longer relevant - the new Autopilot is a different product.

Second, Tesla, as always, distorts the facts. They just present statistics and expect the world to believe them, without sharing the underlying data. They mention only the total number of Tesla miles driven (not miles with Autopilot engaged), and they compare their vehicles against all automobiles when they should be comparing only against comparably priced vehicles (e.g. MB S-Class, BMW 7-Series).


SanJoseDriver - 4/6/2018 2:45:48 AM
+1 Boost
Good point on the codebase; it could be less safe than previous versions. We'll need to give it more time to see. General feedback on it has been really positive (it's way more capable on tough roads).

They are lumping the S, X, and 3 together since they all use the same version of Autopilot, and I think it still makes sense to compare it to cars in general. The question is whether Autopilot is safer than driving a non-Autopilot car, not another "premium" car. I'm also not sure the stats would be much better (I'm assuming AMG/M cars get into more accidents).


Copyright 2026 AutoSpies.com, LLC