Driving Clicks
How's this for a headline: "Self-Driving Mercedes Will Be Programmed To Sacrifice Pedestrians To Save The Driver"? Posted to Fast Company back in 2016, it was brought to my attention today after being the subject of a post. The article is subtitled: "Mercedes gets around the moral issues of self-driving cars by deciding that–of course–drivers are more important than anyone else." It goes downhill from there.
At issue is a quote that one Christoph von Hugo, an autonomous vehicle safety manager, gave Car and Driver: “If you know you can save at least one person, at least save that one. Save the one in the car. If all you know for sure is that one death can be prevented, then that’s your first priority.” From this statement, Charlie Sorrel, the Fast Company columnist, presumes that Daimler-Benz is simply choosing to avoid hard moral questions to justify using pedestrians as airbags to protect the people who bought their cars.
But the moral calculus is actually more involved than it may appear. In effect, what Mr. von Hugo was saying is that the cars would be programmed to manage what they could actually control. Consider the following scenario: a driverless car has a blowout and suffers a partial loss of control at speed. The car is going to hit a nearby wall with a pedestrian standing in front of it; the car can control how the impact occurs, but it cannot prevent the impact. The car, in theory, knows enough about its own construction to determine which angles of impact would crush the passenger compartment and kill the driver. What the car can't know is what other external factors may contribute to the pedestrian being injured or killed. Say that half the time the car endeavors to save the driver, and the other half of the time it attempts to spare the pedestrian. When the car saves the driver, the driver survives and the pedestrian is killed. When the car acts to save the pedestrian, the driver is killed. But, and this is where things become interesting, because the car can't know the full set of external factors and variables, the pedestrian doesn't always survive either. Even though the car missed the pedestrian when it struck the wall, there's no guarantee that something else doesn't injure or kill them: something that is a) a direct result of the accident but b) outside of the car's control.
And this leaves us with two scenarios: one where the car takes the sure thing, and another where it gambles. When the car opts to save the driver, exactly one of the two people dies, so the mortality rate is always 50%. When the car opts to save the pedestrian, the mortality rate is at least 50%, but it may be higher, because there will be situations where the pedestrian is killed anyway. So the two options are not equal. They may be close, but when the car gambles and prioritizes the pedestrian, there is always a chance that both people will die. And if this is the logic that informed the programmers of the autonomous Mercedes, then one can see why they made the choice that they did.
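The comparison above can be made concrete with a small sketch. The probability that the pedestrian dies anyway from factors outside the car's control is not something anyone actually knows; the variable `p_ped_dies_anyway` below is a hypothetical parameter introduced purely for illustration, as are the policy names.

```python
def expected_deaths(policy, p_ped_dies_anyway):
    """Expected fatalities out of two people (driver + pedestrian),
    under a simplified model of the scenario in the text."""
    if policy == "save_driver":
        # The sure thing: driver survives, pedestrian is struck.
        # Exactly one certain death.
        return 1.0
    elif policy == "save_pedestrian":
        # The gamble: driver dies, and the pedestrian may still die
        # from external factors the car cannot control.
        return 1.0 + p_ped_dies_anyway
    raise ValueError(f"unknown policy: {policy}")

# Sweep a few hypothetical values of the uncontrollable risk.
for p in (0.0, 0.1, 0.3):
    a = expected_deaths("save_driver", p)
    b = expected_deaths("save_pedestrian", p)
    print(f"p={p}: save_driver -> {a} deaths, save_pedestrian -> {b} deaths")
```

Under this toy model the two policies are equivalent only when the uncontrollable risk to the pedestrian is exactly zero; for any positive risk, the gamble has strictly higher expected mortality, which is the asymmetry the paragraph above describes.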
A luxury carmaker makes for easy outrage mining. The headline is "shocking" enough that some people are going to take it and run without digging any deeper. One day people will learn to be wary of clickbait on the internet, but it's going to take people being seriously burned for that to happen. Until then, class warfare will prove to be good and easy to win.