On Average Humans Having To Take Control Of Google Cars Over 20 Times A Month

Consumer Watchdog today called on the National Highway Traffic Safety Administration to require a steering wheel, brake and accelerator so a human driver can take control of a self-driving robot car when necessary in the guidelines it is developing on automated vehicle technology.

In comments for a NHTSA public meeting today about automated vehicle technology, John M. Simpson, Consumer Watchdog's Privacy Project Director, also listed ten questions he said the agency must ask Google about its self-driving robot car program.

"Deploying a vehicle today without a steering wheel, brake, accelerator and a human driver capable of intervening when something goes wrong is not merely foolhardy. It is dangerous," said Simpson. "NHTSA's autonomous vehicle guidelines must reflect this fact."

Read Simpson's comments here: http://www.consumerwatchdog.org/resources/nhtsatestimony040816.pdf

The need to require a driver behind the wheel is obvious after a review of the results from seven companies that have been testing self-driving cars in California since September 2014, Consumer Watchdog said.

Under California's self-driving car testing requirements, these companies were required to file "disengagement reports" explaining when a test driver had to take control. The reports show that the cars are not always capable of "seeing" pedestrians and cyclists, traffic lights, low-hanging branches, or the proximity of parked cars, suggesting too great a risk of serious accidents involving pedestrians and other cars. The cars also are not capable of reacting to reckless behavior of others on the road quickly enough to avoid the consequences, the reports showed.

"Google, which logged 424,331 'self-driving' miles over the 15-month reporting period, said a human driver took over 341 times, an average of 22.7 times a month," Simpson said. "The robot car technology failed 272 times and ceded control to the human driver; the driver felt compelled to intervene and take control 69 times."

"What the disengagement reports show is that there are many everyday routine traffic situations with which the self-driving robot cars simply can't cope," said Simpson. "It's imperative that a human be behind the wheel capable of taking control when necessary. Self-driving robot cars simply aren't ready to safely manage too many routine traffic situations without human intervention."

Questions for Google

Consumer Watchdog noted that Google is pressing NHTSA to create a fast-track approval process for its self-driving robot cars that would bypass usual rulemaking proceedings and Federal Motor Vehicle Safety Standards. Simpson said NHTSA should reject Google's proposal and instead ask the company ten tough questions as the agency develops its automated vehicle technology guidelines. They are:

1. We understand the self-driving car cannot currently handle many common occurrences on the road, including heavy rain or snow, hand signals from a traffic cop, or gestures to communicate from other drivers. Will Google publish a complete list of real-life situations the cars cannot yet understand, and explain how it intends to deal with them?

2. What does Google envision happening if the computer "driver" suddenly goes offline with a passenger in the car, if the car has no steering wheel or pedals and the passenger cannot steer or stop the vehicle?

3. Your programmers will literally make life and death decisions as they write the vehicles' algorithms. Will Google agree to publish its software algorithms, including how the company's "artificial car intelligence" will be programmed to decide what happens in the event of a potential collision? For instance, will your robot car prioritize the safety of the occupants of the vehicle or pedestrians it encounters?

4. Will Google publish all video from the car and technical data such as radar and lidar reports associated with accidents or other anomalous situations? If not, why not?

5. Will Google publish all data in its possession that discusses, or makes projections concerning, the safety of driverless vehicles?

6. Do you expect one of your robot cars to be involved in a fatal crash? If your robot car causes the crash, how would you be held accountable?

7. How will Google prove that self-driving cars are safer than today's vehicles?

8. Will Google agree not to store, market, sell, or transfer the data gathered by the self-driving car, or utilize it for any purpose other than navigating the vehicle?

9. NHTSA's performance standards are actually designed to promote new life-saving technology. Why is Google trying to circumvent them? Will Google provide all data in its possession concerning the length of time required to comply with the current NHTSA safety process?

10. Does Google have the technology to prevent malicious hackers from seizing control of a driverless vehicle or any of its systems?

Simpson's comments to NHTSA concluded:

"NHTSA officials have repeatedly said safety is the agency's top priority. You must not allow your judgment to be swayed by rosy, self-serving statements from companies like Google about the capabilities of their self-driving robot cars. NHTSA has said that autonomous vehicle technology is an area of rapid change that requires you to remain 'flexible and adaptable.' Please ensure that flexibility does not cause you to lose sight of the need to put safety first. Innovation will thrive hand-in-hand with thoughtful, deliberate regulation. Your guidance for the states on autonomous vehicles must continue to require a human driver who can intervene with a steering wheel, brake and accelerator when necessary."

Read our letter to the DOT and NHTSA and 10 questions for Google here: http://capitolwatchdog.org/sites/default/files/FoxxRosekind4-7-16.pdf

TomM - 4/12/2016 8:13:15 AM
0 Boost
I think it is time to compare self-driving cars with the other non-driving modes of transportation. If you are going to get there - a bus or a train has LOTS of advantages. Both are likely to be faster than a self-driving car that will not ever exceed the posted speed limit (first problem). They simply would not be allowed to program them to do so. A train on a main line travels far faster. However - it is limited to the tracks. The bus - on the other hand - is driven by a sentient real human - who can avoid situations that a self-driver will never even see. The bus limits your liability (you don't own it or drive it) - and is likely to be far less expensive on a trip (including cost of ownership of the vehicle). And of course the airplane is far faster. The main advantage a self-driving car has is for short local trips - i.e., convenience.

In return - you turn the driving experience into the equivalent of being hauled in a little red wagon - by an animal.

And - you essentially eliminate the reason to buy premium cars - since ALL cars will drive essentially the same. Plus - once the politicians realize that they can and will have to mandate LIMITS on all cars to drive no faster than the self-driving cars - we also eliminate a major portion of the Western world economy.

Or you can enjoy driving your own car. I will admit that for some who are no longer able (or never were able) to drive a car, self-drivers will open up many things to the shut-ins - but a self-driving car is like artificial insemination: you get there, but you have no enjoyment.


hangtime010 - 4/12/2016 10:05:54 AM
+2 Boost
"Google, which logged 424,331 'self-driving' miles over the 15-month reporting period, said a human driver took over 341 times, an average of 22.7 times a month," Simpson said. "The robot car technology failed 272 times and ceded control to the human driver; the driver felt compelled to intervene and take control 69 times."

Did the watchdog group expect that in this testing period the car would be 100% reliable? I can't ever see autonomous cars being truly viable without communication between cars in a set radius. If all cars did "talk" to one another, that would allow each car to anticipate the drive/route ahead.
I wonder, when the car relinquished control to the driver, how long did it stay with the driver? Did the car take back control?
What were the circumstances of the failures they recorded?
After reading the pdf, it really sounds like a movie script - one where the politician piles all sorts of negativity into a document even though some of the incidents only happened once or twice (over-exaggerated IMO). I'm sure more distracted drivers cause accidents (and many more near-misses) every day than the Google cars did over the past 15 months of testing.


t_bone - 4/12/2016 9:35:25 PM
+2 Boost
Sounds like a relaxing commute... most days it isn't IF but WHEN you have to intervene.

A bit like driving with a student driver.


Copyright 2026 AutoSpies.com, LLC