Chevy's Self Driving Bolts Are Racking Up The Wrecks - Why Is It Never The Software's Fault?

The autonomous Bolts that GM's self-driving start-up has running around San Francisco were involved in 22 accidents during 2017, none of which were the software's fault, legally speaking.

Cruise Automation has been using a fleet of self-driving Chevrolet Bolts to log autonomous miles in an urban environment since GM purchased the company for more than $1 billion in 2016. When you're trying to disrupt personal transportation as we know it and develop a new technology standard, there are bound to be a few incidents.


Read Article

MDarringer - 12/26/2017 12:47:15 PM
0 Boost
"Why Is It Never The Software's Fault?" Because to admit otherwise would be to admit the tech is nowhere near ready.


Agent009 - 12/26/2017 3:59:44 PM
+1 Boost
I test software for a living. There are numerous assumptions baked into any code, so the software can definitely be at fault in certain situations. The code is only as good as the acceptance criteria and the people who understand them.


Vette71 - 12/26/2017 5:18:04 PM
0 Boost
You're both saying the same thing. The sum total of all the assumptions in the software can't duplicate the intuitive nature of the human brain, and until it can, the truly autonomous car remains out of sight. Folks working on AI will tell you it is a ways off. Ban humans from the mix and it might work, but even then the software in manufacturer X's vehicle might differ from that in manufacturer A's vehicle, and there will still be "a failure to communicate."

Look at healthcare. Twenty-five years ago, a video called "Imagine" was produced showing the power the internet offered healthcare. One key element was a standardized language for medical records that would enable a person to be treated anywhere, with the treating physician having access to all their records from anywhere else. That's a big help in an emergency, when minutes count. It still doesn't exist, since vendors cannot, or won't, agree on that language.


TheSteve - 12/26/2017 2:29:45 PM
+6 Boost
If you read the article, each example it provides shows the autonomous software making decisions that were prudent and consistent with an attentive, cautious human driver. For example, the autonomous car decelerated as a bus entered its lane, and it got rear-ended. How can that possibly be anyone's fault other than the inattentive, following-too-closely driver who rear-ended it?

I'm *NOT* an autonomous car fan, but when a collision happens and it's clearly not the autonomous vehicle's fault, then what's the problem with saying so?


Copyright 2026 AutoSpies.com, LLC