DID YOU TAKE part in MIT’s Moral Machine questionnaire? Cumulatively, millions of people from 233 different countries made 40 million ethical decisions about whose life a driverless car should prioritise if some kind of collision is inevitable.
Well, the results are in.
In this grisly referendum on life and death – a bold move, given we can’t even be trusted to name a boat sensibly – the real winners were the young and the sociable. Given various scenarios, users around the globe consistently told the driverless car to squish animals and the elderly, and to spare the lives of people in groups and children.
For human drivers, our slow reaction times mean these kinds of calculations are purely theoretical. You may think you’d swerve into a dog to save a baby, but the split second you’d have to make the call would likely be insufficient to assess the species of canine, let alone consider what to do about it. Driverless cars, on the other hand, can be programmed to make these calls in advance.
Although the results broadly followed the same trends around the world, there were some interesting regional differences. The likes of China and Japan, for example, were less inclined to spare the young over the old, and they also cared less about saving the lives of those with lots of money compared to their European and North American counterparts.
It’s a squeamish topic and companies manufacturing self-driving technology don’t really want to talk about it, for the same reason that beer companies don’t spend much time discussing liver health. Still, the researchers feel that this kind of collective discussion should be used to help form the basis of driverless cars’ moral core.
Of course, such cold, clinical lab conditions seldom occur on real roads, where things get a little bit messy. Swerving one way or another could result in a slower speed of impact, and therefore less chance of death for anybody no matter their age.
Still, if you want to put your own brain through the 21st-century remix of the Trolley Problem, you can do so here.
Source: Inquirer