MIT’s Moral Machine: how should self-driving cars react when a crash is inevitable?

By Ovais Jafar

With four out of five accidents caused by human error (driver forgetfulness, recklessness, overconfidence and distraction), there is a strong case for self-driving cars, in which computers make decisions based on data and calculation rather than impulse.

However, those who oppose artificial intelligence taking over from humans argue that driving requires split-second decisions for which there may be no right answer.

When a squirrel or cat darts onto the road, do you risk swerving and hitting other cars, causing a pile-up, or do you continue straight on, hoping the animal survives?

To explore such situations, a team at the Massachusetts Institute of Technology (MIT) has developed the Moral Machine, a platform that asks users to resolve moral dilemmas about how self-driving cars should respond.

Speaking to Digital Trends, MIT research assistant Sohan Dsouza argued that self-driving cars are practically inevitable and said “they would help save countless lives now being lost daily due to human driver error and can offer independent mobility to countless others who cannot drive.”

Aiming to address scenarios in which moral decisions must be made, the Moral Machine asks participants to help decide 13 dilemmas, giving them additional information in each scenario, such as the potential victims’ age, gender and social status, and whether they are breaking the law.

The pooled results could one day guide the development of machines capable of ethical decision-making.
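As a rough illustration of what “pooling” such judgments might involve, consider a minimal sketch in which each response records the traits of the group a participant chose to spare, and responses are tallied to surface aggregate preferences. The `Response` structure and `tally_preferences` helper are hypothetical, not the Moral Machine’s actual data model.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record of one Moral Machine judgment; the platform's
# real data model is not described in this article.
@dataclass
class Response:
    spared: str                 # which group was spared, e.g. "pedestrians"
    spared_traits: tuple        # traits of that group, e.g. ("doctor", "pregnant")

def tally_preferences(responses):
    """Count how often each trait appears among the groups participants spared."""
    counts = Counter()
    for r in responses:
        counts.update(r.spared_traits)
    return counts

# Example: three pooled responses from different participants
responses = [
    Response("pedestrians", ("doctor", "pregnant")),
    Response("pedestrians", ("athlete",)),
    Response("passengers", ("criminal",)),
]
print(tally_preferences(responses))
# Counter({'doctor': 1, 'pregnant': 1, 'athlete': 1, 'criminal': 1})
```

Across millions of real responses, tallies like these would reveal which traits people systematically choose to protect, which is the kind of signal that could inform ethical computing research.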

While that possibility remains distant, for now the Moral Machine aims to start a conversation: through debate and engagement, it helps the team understand how humans perceive the morality of decisions made by computers when a car crashing into something, or someone, is unavoidable.

While some of the questions posed by the MIT team are easy to answer, others will make participants question their own morality. What would you do if you experienced sudden brake failure and had to choose between mowing down three criminals or a doctor and two pregnant women? Perhaps easy to answer, but what if the choice were between two athletes and two large (seemingly unhealthy) individuals?

Those who want to try out the Moral Machine can judge the outcomes of each scenario here, and possibly help shape the future of self-driving cars.


Ovais Jafar is a multimedia journalist; he tweets as @ovaisjafar