Think There Are Too Many Recalls? Just Wait Until Self Driving Cars Hit The Road In Numbers

Nascent advanced technology and safety systems already are showing up on vehicle recall lists and will become an increasing factor in industry warranty and compliance efforts, experts here say.

It’s easy to see why. Today’s typical luxury vehicle contains more than 1 billion lines of software code, say officials with Stout Risius Ross, a financial and legal advisory firm that has been compiling and analyzing intricate data on vehicle recalls and warranty costs for several years. That compares with fewer than 1 million lines of code for the Space Shuttle, or 2 million for the average fighter jet, the company says.

TheSteve - 3/17/2016 1:17:38 PM
+1 Boost
This sure sounds like an alarmist thread :-(

Who here uses a computer? Okay, everybody... it's a web site. How many Windows updates have you received? Over 200 "important" ones, if you're on Windows 7. Do you perceive each software update as a "computer recall"? In reality, the vast majority of them are fixes for potential problems that you've likely never experienced, and never will.

When autonomous vehicles become more common, we'll have to rethink what we believe about vehicles, just as we've had to rethink what we believed about communications when phones became common (over letters), when email trumped snail mail, when mobile phones surpassed landlines, when digital media surpassed music on physical media, and so on.

The world keeps changing. Some people fear change, based on their understanding of their currently embraced paradigm. But the paradigm will continue to evolve.


xjug1987a - 3/17/2016 1:44:49 PM
+1 Boost
All this talk of "self-driving cars" has me really scratching my head. Do the car companies really trust the technology THAT much? I mean, it's one thing to sell a car to someone who can make a mistake and crash. It's another when you market your vehicle as "self-driving," meaning I'm not driving it, so if there is a crash, WHO is responsible? Seems to me the manufacturer, especially in our litigious society where no one is ever at fault for anything. Everyone is a victim.


ChiAutoGuy - 3/17/2016 5:59:32 PM
+1 Boost
On the plus side, these self-driving cars should also be able to drive themselves to and from the dealership. So we have that!


Vette71 - 3/17/2016 7:39:28 PM
+2 Boost
Steve is on the mark. Software systems are continuously upgraded and defects corrected. Many are upgraded nightly and the user is none the wiser. So there will actually be fewer back-to-the-dealer recalls.

The article's comment that the Silicon Valley crowd doesn't get it is true. They are trying to develop an artificial intelligence system, and to do that they need to understand how the human sees situations and acts. Instead of inventing software and trying it out directly in traffic, the silicon teams ought to put the systems into thousands of vehicles and let humans drive them. Then they should look for the situations where the human does not do what the software would have told the car to do. The human didn't drive into the side of the bus when the system was telling the vehicle to do exactly that. WHY? WHY? WHY? The car companies are ahead of the game with their semi-automated systems. Measuring when I accept or override mine can tell them a lot more about what full automation has to do than the approach Google et al. are using.

If you have an EKG done these days, that little $1,000 EKG machine diagnoses your heart and tells the doc what may be wrong. It started 35 years ago with $100K minicomputer-based systems that did the analysis. But the designers still had the cardiologists over-read and edit the results, and they learned where they had to tweak their software. The software was continuously refined until, years later, the device could do the job on its own and the human could fully trust the results.
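The override-mining idea above could be sketched roughly like this. Everything here is hypothetical for illustration: the field names, the steering-angle signal, and the 10-degree disagreement threshold are made up, not taken from any real vehicle logging system.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t: float              # timestamp (seconds)
    planner_steer: float  # steering angle the software would command (degrees)
    human_steer: float    # steering angle the human actually applied (degrees)

def find_disagreements(log, threshold_deg=10.0):
    """Return samples where the human's input diverged from what the
    automation would have commanded by more than threshold_deg.
    These are the 'WHY? WHY? WHY?' moments worth engineering review."""
    return [s for s in log if abs(s.human_steer - s.planner_steer) > threshold_deg]

# Toy drive log: three timesteps, one sharp disagreement.
log = [
    Sample(0.0, 0.0, 0.5),    # routine lane keeping: agreement
    Sample(0.1, 15.0, -2.0),  # planner wanted a hard right; human held the lane
    Sample(0.2, 0.0, 0.0),
]

flagged = find_disagreements(log)
print(len(flagged))  # 1 disagreement flagged for review
```

The point is only that the comparison is cheap once both signals are logged: the human drives, the software shadows, and the gap between them becomes the training signal.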


MorePower - 3/18/2016 3:27:48 AM
0 Boost
You're wrong.

The systems are not trying to mimic human behavior at all. They are simply operating the vehicles in accordance with the rules of the road that have been built into their programming.


Vette71 - 3/18/2016 10:14:18 AM
+1 Boost
BIG PROBLEM. Humans do not follow the regulations to the letter of the law, and by nature will always try to find shortcuts. History proves that. The majority drive above the speed limit on interstates, do rolling stops at intersections, sometimes pass on the right, etc. Since the majority of vehicles on the road for the next couple of decades will be human-driven, autonomous vehicles will have to fit into the existing situation.

Go back to the EKG example. Cardiologists were taught to take specific measurements on the graphical output, and the sum of those measurements gave the diagnosis. Those were the "rules" you mention. Once out in practice they became good at pattern recognition and no longer followed the procedural rules exactly, because it was faster that way. And from their work with patients they evolved slightly different diagnoses than the rules dictated. So they declared the new software to be "wrong" even though it followed the rules. Software engineers allowed the software's decision criteria to be adjusted to fit how the cardiologists read things, and over time evolved the systems to reflect the human approach. Acceptance then grew to the point where the software systems took over.

What I am saying is that the approach Google et al. are taking by encoding the rules won't be successful, because humans already bend those rules. Rather than encoding the rules, a better strategy would be to develop artificial intelligence that mimics the human, fits into the existing environment, and would be adopted at a faster pace.

Software is a great thing. But look at the many failures in things like automotive dashboard controls, where layered menus replaced knobs and made tasks take longer: software developers who ignore the human way of doing things do so at their peril.
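The EKG tuning loop described above, where the software's decision criteria were adjusted until they matched the cardiologists' readings, can be sketched as a toy threshold search. The measurement values, labels, and candidate thresholds below are all invented for illustration and have no clinical meaning.

```python
def tune_threshold(readings, expert_labels, candidates):
    """Pick the decision threshold whose diagnoses best agree with the
    experts' calls: a toy stand-in for 'adjusting the decision criteria
    to fit how the cardiologists read things'."""
    def diagnose(value, thr):
        return "abnormal" if value > thr else "normal"

    best_thr, best_agree = None, -1
    for thr in candidates:
        agree = sum(diagnose(v, thr) == lab
                    for v, lab in zip(readings, expert_labels))
        if agree > best_agree:
            best_thr, best_agree = thr, agree
    return best_thr

# Hypothetical interval measurements and the experts' over-read calls.
readings = [0.08, 0.12, 0.20, 0.35, 0.40]
labels   = ["normal", "normal", "normal", "abnormal", "abnormal"]

print(tune_threshold(readings, labels, [0.1, 0.2, 0.3]))  # → 0.2
```

Real systems tune many interacting criteria rather than one cutoff, but the loop is the same: compare the software's output against the human expert's, and move the criteria toward the human until the disagreements disappear.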


Copyright 2026 AutoSpies.com, LLC