Self-Driving Cars' Dirty Little Secret: You Are Twice As Likely To Crash With Google At The Wheel
The self-driving car, that cutting-edge creation that’s supposed to lead to a world without accidents, is achieving the exact opposite right now: The vehicles have racked up a crash rate double that of those with human drivers.

The glitch?

They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well.

TheSteve - 12/18/2015 11:24:41 AM
+3 Boost
More accurate but much less sensational headline: "Google's Autonomous Cars Are More Likely to be Hit By Careless Drivers Than Your Average Car"

Examine the stats and the truths become evident:
- Goog's autonomous cars are over-represented in incidents
- Not a single incident has been attributed to Goog's autonomy

The causes of these incidents have been attributed to (a) other drivers, and (b) a human driver manually operating the car. I'm not a fan of autonomous cars, but I am a big fan of truth and understanding.


TheSteve - 12/18/2015 11:29:23 AM
+3 Boost
Added note: I've even seen articles, reprinted here on 'Spies, that blame the Google cars for being "too careful." That's right. Because the Google car is so risk-averse -- never speeding, never making illegal turns, always coming to a full stop at stop signs -- humans blame them for "causing" accidents. I point the finger at:
- Inattentive human drivers
- Human drivers that assume the Google car will take the same risks as they will

How can you blame a Google autonomous vehicle for being rear-ended at a red light, and not taking evasive action into oncoming traffic? Well, for starters, "Another Google Car Involved In Accident" makes headlines that sell, while "Another Human Driver Hits a Google Autonomous Car" doesn't.


Agent009 - 12/18/2015 3:15:35 PM
+3 Boost
@ TheSteve,
You are correct.

They have not been blamed for a single accident (that I am aware of), but there was at least one incident where the car was to be ticketed for impeding traffic, and there was no driver to ticket. That might be a hint in itself.


Remember, they are being rear-ended at twice the average rate, which indicates they are fundamentally doing "something" different from the average driver, and real drivers are having a hard time with it.

Who knows why?

Perhaps it is as simple as not keeping up with "real" traffic flow when it exceeds posted speeds? (A biggie down here in Texas, where you can be ticketed for not keeping up with traffic even when you are driving at the posted speed.) Or maybe it is braking at a changing light rather than entering the intersection on yellow and passing through. Either way, the guy in the rear was thinking the car in front was going to do something other than what it did.

Keep in mind both Google and Mercedes will tell you their cars drive like grannies. Which "might" be as dangerous as speeding in the wrong situation.

With that in mind, if real traffic flow is 70 in a 60 and a self-driving car is being overtaken at all angles, I can imagine the chaos a simple lane change could cause. A human could speed up to match traffic to minimize the risk, execute the lane change, then drop back to the posted speed. A self-driving car would not do that and might actually "dart" into a lane, causing a hazard.

Would a self-driving car in a 70 mph zone speed up to 75 to pass a 65 mph truck on a two-lane road to minimize the risk of a head-on collision, and then slow back to 70? I doubt it; it would have to make a decision outside of its defined parameters (exceed the posted speed limit).
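The contrast being hypothesized above can be sketched in a few lines. This is purely illustrative (not Google's actual control logic, whose details are not public): a rigid policy that never exceeds the posted limit versus a human-style policy that will briefly match traffic flow, up to a small tolerance, to reduce the speed differential during a pass or lane change. The function names and the 5 mph tolerance are assumptions made up for the example.

```python
def rigid_target_speed(posted_limit: float, traffic_flow: float) -> float:
    """Hypothetical rigid policy: never exceed the posted limit,
    no matter how fast surrounding traffic is moving."""
    return min(posted_limit, traffic_flow) if traffic_flow < posted_limit else posted_limit

def human_target_speed(posted_limit: float, traffic_flow: float,
                       tolerance: float = 5.0) -> float:
    """Hypothetical human-style policy: match traffic flow,
    but only up to a small tolerance above the posted limit."""
    return min(traffic_flow, posted_limit + tolerance)

# Traffic flowing at 70 mph in a 60 mph zone:
print(rigid_target_speed(60, 70))   # 60 -> cars behind close at 10 mph
print(human_target_speed(60, 70))   # 65 -> smaller speed differential
```

The point of the sketch is that the rigid policy maximizes legal compliance but also maximizes the speed differential with surrounding traffic, which is one plausible mechanism behind the elevated rear-end rate the thread is discussing.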

While it is a marvel they can even drive on the street, I still think California has it right in requiring a licensed driver at the helm to intervene in unusual incidents.

No doubt the industry will move in this direction eventually; however, until the cars on the road with self-driving outnumber those without, it will be tough. If most of the cars are driving alike via automation, then it becomes much easier.

While we can program in the tangibles, it is the intangibles that worry me at this time.



Vette71 - 12/18/2015 4:57:17 PM
+2 Boost
009 is on the money. Software-controlled machines just don't have the learning ability of humans, whether for good or bad habits. Experience teaches us a great deal about what to expect from our fellow humans, and it generally works. Since the autonomous cars are a small minority in the mix, they will bear the responsibility for the accidents. Perhaps they should be required to sport large "Student Driver" signs as a warning to other drivers to expect the unexpected.


jeffgall - 12/20/2015 12:13:29 AM
+1 Boost
Why is anyone surprised? They are acting like any other Lexus driver: doing 10 under the speed limit in the left lane.


Copyright 2026 AutoSpies.com, LLC