Tesla Warns Owners That Autopilot Is An Assistant NOT A Replacement For The Driver

Tesla officially touts its partial self-driving Autopilot feature as a back-up to human drivers, but some of the car maker's comments -- and some high profile videos -- have led many drivers to think of it as a replacement.

The questions of whether drivers are being lulled into a false sense of security by such technology, and who is legally to blame for an accident, have come into focus after it was made public last week that a driver using Autopilot died when he drove under a truck in May.



MDarringer - 7/5/2016 8:47:54 AM
+3 Boost
At this point, they are covering their asses for the coming bastard of a civil suit against them for the death of the driver watching Harry Potter. Of course, owners will also sue because Tesla's Autopilot is not really autopilot as represented to customers when the cars were purchased. This is deliciously good.


Agent009 - 7/5/2016 10:13:05 AM
-1 Boost
I'm afraid they have always said this, but the fan base wasn't listening.



Car4life1 - 7/5/2016 12:24:31 PM
+2 Boost
Unfortunately, people don't listen, and it's human nature to want to push the envelope. Tesla needs to take notes from the OG in the game, Benz, and literally have the car instruct the driver to take the wheel periodically...until the system is flawless and the majority of other cars on the road are autonomous.



mre30 - 7/5/2016 10:53:47 AM
+1 Boost
Very sorry for the Harry Potter-watching victim's untimely demise...but boy does this scandal have legs.

Let me set up my lawn-chair for a few months of spectating!


MDarringer - 7/5/2016 3:35:04 PM
0 Boost
You and me both. I'll bring the martini fixins.


TheSteve - 7/5/2016 11:34:13 AM
-1 Boost
Tesla has ALWAYS positioned their product as a "driver assist system" and explicitly told owners that they must keep their hands on the wheel to take control, should that be necessary.

Unfortunately, Tesla did two things very badly:

(1) They called their product "Autopilot," a term that most people have heard and associate with fully autonomous aircraft flight.

(2) They released a product they knew was still in its infancy, believing that on-road data gathered from beta testers (anyone who uses Autopilot) would be collected and used to iron out the bugs. The result was a lot of "bugs": misdesigned or poorly implemented functionality.

FWIW, in the early days of Google's cars, they discovered that "driver assist" systems increased accidents because the driver would become inattentive, lulled into a false sense of security, so when the system misbehaved or failed, the driver could not become aware of the problem, assess the situation, take control, and perform the correct action quickly enough to prevent an accident. Google abandoned driver assist systems for exactly this reason!


PUGPROUD - 7/5/2016 1:13:38 PM
-3 Boost
Nothing new here...the human has always been the weakest link in the chain. People are people. Imagine when this technology is widespread across all of humankind. The possibilities for unforeseen actions and unintended consequences are infinite!


Copyright 2026 AutoSpies.com, LLC