Thursday, November 26, 2015

Should your car be programmed to kill you?

Imagine this scenario: you are in your new driverless car when a situation arises (it doesn't matter how) in which the driver, i.e. the computer, must choose between crashing into a group of young schoolchildren, probably killing several of them, or slamming the car into a wall and probably killing the passenger, i.e. you! Simple ethics would recommend a least-harm approach, but that may mean killing you. Would you buy a car programmed to kill you? Or would you prefer one that makes the less ethical choice and always seeks to protect the car's occupants?

These ethical dilemmas are coming to the fore with the advent of autonomous systems. Several years ago the UK's Royal Academy of Engineering published a report on the ethics of emerging technologies and autonomous systems. More recently, MIT Technology Review posted a piece titled "Why Self-Driving Cars Must Be Programmed to Kill". My colleague, Paul Ralph, also just gave a radio interview on this subject.
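To make the dilemma concrete, here is a minimal sketch of how the two programming choices differ. Everything in it is a hypothetical illustration (the class, function names, and probability figures are invented for this post, not taken from any real vehicle control system):

```python
# Hypothetical sketch: two decision policies for the dilemma above.
# All names and numbers are illustrative, not a real control system.

from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    expected_pedestrian_deaths: float
    expected_occupant_deaths: float

    @property
    def total_expected_deaths(self) -> float:
        return self.expected_pedestrian_deaths + self.expected_occupant_deaths

def least_harm_policy(options: list[Outcome]) -> Outcome:
    """The 'simple ethics' rule: minimise total expected deaths,
    even if that means sacrificing the occupants."""
    return min(options, key=lambda o: o.total_expected_deaths)

def protect_occupants_policy(options: list[Outcome]) -> Outcome:
    """The self-protective rule: minimise occupant deaths first,
    breaking ties by total harm."""
    return min(options, key=lambda o: (o.expected_occupant_deaths,
                                       o.total_expected_deaths))

options = [
    Outcome("continue into the group of children", 3.0, 0.0),
    Outcome("swerve into the wall", 0.0, 0.9),
]

print(least_harm_policy(options).description)         # swerve into the wall
print(protect_occupants_policy(options).description)  # continue into the group
```

Both policies are trivial to state in code; the hard part is deciding which rule manufacturers, regulators, and car buyers are actually willing to accept.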
