Self Driving Cars So Far Have Not Addressed Ethical Decision Making

A large truck speeding in the opposite direction suddenly veers into your lane.

Jerk the wheel left and smash into a bicyclist?

Swerve right toward a family on foot?

Slam the brakes and brace for head-on impact?

Drivers make split-second decisions based on instinct and a limited view of the dangers around them. The cars of the future — those that can drive themselves thanks to an array of sensors and computing power — will have near-perfect perception and react based on programmed logic.

quizzquizz - 11/24/2014 7:04:54 PM
I assume the response with the fewest fatalities. More complicated still, our civil laws value the economic life of an adult more highly than that of a child, so if it's between hitting 2 pedestrian toddlers or injuring a pair of yuppies, it will be the children, because a lawsuit by the parents of the 2 children will end up costing less than a lawsuit by the yuppies.


LJ745LJ745 - 11/24/2014 7:49:16 PM
Drivers don't usually behave rationally in these instances and end up causing MORE harm. With integrated systems, there won't be head-on collisions with other vehicles, so that is out. Add an app to the smartphone the cyclist is carrying and voila, the car can track that too. This discussion of ethics in this situation is useful only insofar as it reminds us that academics are idiots.


Copyright 2026 AutoSpies.com, LLC