To kill, or not to kill?
Driverless cars face moral dilemmas: should those dilemmas be solved by ethics or by data? Research on the ethics of driverless cars is being conducted by the Media Lab at MIT and the Culture and Morality Lab at the University of California, Irvine, where researchers try to address questions such as: “Should the car risk its passengers’ lives by swerving to the side—where the edge of the road meets a steep cliff? Or should the car continue on its path, ensuring its passengers’ safety at the child’s expense?”
Shariff and his colleagues at the MIT Media Lab launched a website called “Moral Machine” to gather more information about how people would prefer autonomous cars to react in scenarios where passenger and pedestrian safety are at odds. On the site, you can click “Start Judging” and decide where the car should crash and, consequently, whom it should kill to save the others. Do you prefer to save young people or seniors? Women or men? Doctors or robbers? Should the car kill two passengers or five pedestrians? Take the test and help gather information about the human perspective on moral decisions made by machine intelligence, such as self-driving cars. It is also an interesting way to get to know your own preferences and ethics!
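To make the kind of data the site collects more concrete, here is a minimal Python sketch of how one such dilemma and a batch of respondents’ judgments could be represented and tallied. All names, attributes, and the tallying logic are hypothetical illustrations, not the Moral Machine’s actual data model or analysis.

```python
from dataclasses import dataclass
from collections import Counter

@dataclass(frozen=True)
class Character:
    """One person in a scenario (attributes are illustrative only)."""
    age_group: str   # e.g. "child", "adult", "senior"
    role: str        # e.g. "passenger" or "pedestrian"

@dataclass
class Scenario:
    """A single dilemma: the car must spare exactly one of two groups."""
    option_a: tuple[Character, ...]
    option_b: tuple[Character, ...]

def tally_spared_groups(judgments):
    """Count how often each age group appears in the option respondents chose to spare.

    `judgments` is an iterable of (Scenario, spared) pairs, where `spared`
    is "a" or "b" indicating which group the respondent saved.
    """
    counts = Counter()
    for scenario, spared in judgments:
        group = scenario.option_a if spared == "a" else scenario.option_b
        counts.update(c.age_group for c in group)
    return counts

if __name__ == "__main__":
    dilemma = Scenario(
        option_a=(Character("child", "pedestrian"),),
        option_b=(Character("adult", "passenger"), Character("adult", "passenger")),
    )
    # One respondent chose to spare the child pedestrian.
    print(tally_spared_groups([(dilemma, "a")]))  # Counter({'child': 1})
```

Aggregating many such judgments across respondents is, in spirit, how researchers can compare how strongly people favor sparing, say, children over adults or pedestrians over passengers.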