Keep Both Hands On The Wheel - This Is Why You Shouldn't Trust A Self Driving Car Just Yet

Earlier this week, a Tesla Model S hit a barrier on the highway near Dallas, Texas. The driver, who fortunately wasn’t injured, first blamed Tesla’s Autopilot for the crash.

We now have footage of the accident and it actually shows a situation that the Autopilot probably shouldn’t be expected to be able to handle, at least not yet. Ultimately, it serves as a reminder not to trust the system without paying attention.

Following our articles on a series of accidents last year in which Autopilot was activated during or right before the crashes, some readers were confused about whether the driver or the Autopilot should be considered at fault.


PUGPROUD - 3/2/2017 8:03:22 PM
0 Boost
Nice recovery though!


mre30 - 3/2/2017 8:04:13 PM
+2 Boost
Yikes -

Glad s/he wasn't hurt; glad nobody else was hurt; looks like a total - so Tesla gets to sell another one; at least it didn't burst into a huge fireball and incinerate the vehicle and the surrounding area (like that accident in France).

Question - lots of other cars - Volvos, Mercedes, BMWs - now have automated driving systems that (while not as relentlessly hyped as Tesla's systems) don't generate YouTube videos about the accidents. Are we to assume that the drivers of those makes don't zone out (or play with their 'Snapchat Spectacles' [see Tesla floor] while driving), or are they just more cautious?

Are the other manufacturers' systems better?


SanJoseDriver - 3/3/2017 9:20:07 PM
-6 Boost
Definitely unfortunate that the car didn't pick up this edge case (and that the driver wasn't paying attention). No other car has an automated system that takes full control over steering and acceleration at 65mph+, so your comment does not apply. Volvo, Mercedes, etc. only have this capability at very low speeds.


MDarringer - 3/3/2017 8:19:10 AM
+4 Boost
Self-driving cars should be illegal.

Driver assist technology is fine, but I don't want some rainbow unicorn snowflake watching Harry Potter trusting the faulty software of his automotive appliance and abdicating his obligation to being a responsible driver.

Self-driving cars make us all more exposed to danger on the roads.


Agent009 - 3/3/2017 8:54:13 AM
+3 Boost
Right now they are driver-assist systems, and people are lulled into a false sense of security and trust them too much.

BTW on the drive home in Dallas yesterday my commute took me past 4 different construction zones very similar to this scenario. So this issue is far more common than one would think.


SanJoseDriver - 3/3/2017 9:26:42 PM
-2 Boost
Autopilot v1 is already 2x safer than an average driver, so your comment about exposure is the inverse of reality. Self-driving is the future (all major manufacturers agree), there is no way around that.

With Autopilot v2 Tesla is being way more conservative and it is not yet available over 50mph. V2 should be able to get to 9-10x safer than an average driver but is a completely different system that will need billions of miles of real-world usage to get to its full potential. Right now it is just slightly better than v1, but there have been 0 incidents so far. Next month will be the real test when v2 is allowed to go highway speeds.


7msynthetic - 3/3/2017 6:20:06 PM
+3 Boost
"nice recovery though" ??? lol the barrier did all the work.

On a more serious note >> Autonomous driving needs a lot of work, so much that Tesla needs to test a lot more before it is available. Mark my words that something catastrophic will happen related to autonomous driving, where Tesla will lose a class action that will sink their already cash-poor company. And all you suckers that put deposits down on a car that doesn't even exist won't see a dime.


SanJoseDriver - 3/3/2017 9:29:32 PM
-5 Boost
That situation is unlikely given that drivers are still told to pay attention and are still ultimately responsible. That will not be the case when full self-driving is enabled, but that won't be for at least 1-2 years, and by that time all Model 3 preorders should be fulfilled ;)


7msynthetic - 3/5/2017 2:58:20 PM
+1 Boost
Right, but what happens before then? A big crash, a few people die = class action = no Model 3 for five more years.

PUGPROUD - 3/5/2017 9:30:43 PM
+1 Boost
Regarding the nice recovery: notice that the car corrected the high-speed bounce off the barrier and kept the car from entering the adjacent lane. Didn't even touch the white lane line. Not many drivers are capable of that save on their own, even when in complete control.


Copyright 2026 AutoSpies.com, LLC