
Engineering news

Driverless car study outlines public divide


Google autonomous car

A study carried out by MIT researchers has found a divide in public opinion on driverless cars

The public is conflicted about fully autonomous cars, taking a notably inconsistent approach to their safety should they become a reality on the roads, says a study by the Massachusetts Institute of Technology.

The research paper, entitled ‘The social dilemma of autonomous vehicles,’ found that while people would prefer autonomous vehicles to minimise casualties in situations of extreme danger, they would be much less likely to use a vehicle programmed that way.

The researchers conducted six surveys, using Amazon's Mechanical Turk online crowdsourcing platform, between June and November 2015.

The results consistently showed that people take a utilitarian approach to the ethics of autonomous vehicles, one emphasising the sheer number of lives that could be saved. For instance, 76% of respondents believe it is more moral for an autonomous vehicle, should such a circumstance arise, to sacrifice one passenger rather than ten pedestrians.
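As a rough illustration (not from the study, and deliberately simplistic), the utilitarian policy respondents endorsed amounts to choosing whichever available action is expected to cost the fewest lives:

```python
def minimise_casualties(options):
    """Pick the action with the fewest expected casualties.

    `options` is a hypothetical mapping from an action name to the
    number of casualties expected if that action is taken.
    """
    return min(options, key=options.get)

# The survey's scenario: swerving sacrifices the single passenger,
# staying the course kills ten pedestrians.
scenario = {"stay_course": 10, "swerve": 1}
print(minimise_casualties(scenario))  # swerve
```

A real vehicle would of course reason over uncertain probabilities rather than fixed counts; the sketch only captures the moral rule most respondents said they preferred in the abstract.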

But the surveys also revealed a lack of enthusiasm for buying or using a driverless car programmed to avoid pedestrians at the expense of its own passengers. One question asked respondents to rate the morality of an autonomous vehicle programmed to crash and kill its own passenger to save ten pedestrians; the rating dropped by a third when respondents considered the possibility of riding in such a car.

"Most people want to live in a world where cars will minimise casualties," said Iyad Rahwan, an associate professor in the MIT Media Lab and co-author of the paper. "But everybody wants their own car to protect them at all costs."

The result is what the researchers call a ‘social dilemma,’ in which people could end up making conditions less safe for everyone by acting in their own self-interest.

Rahwan said, "If everybody does that, then we would end up in a tragedy, whereby the cars will not minimise casualties."

"For the time being, there seems to be no easy way to design algorithms that would reconcile moral values and personal self-interest," Rahwan added.
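The dilemma the researchers describe can be sketched as a toy model (the numbers below are purely illustrative assumptions, not figures from the study): if self-protective cars cause more casualties per incident than casualty-minimising ones, then each rider choosing the "selfish" option makes the roads worse for everyone.

```python
def total_casualties(selfish_riders, riders=100):
    """Toy model of the social dilemma.

    Each of `riders` cars is either 'selfish' (protects its
    passenger at all costs) or 'utilitarian' (minimises total
    casualties). The per-car casualty rates are hypothetical.
    """
    per_selfish, per_utilitarian = 0.5, 0.2  # assumed rates
    utilitarian_riders = riders - selfish_riders
    return selfish_riders * per_selfish + utilitarian_riders * per_utilitarian

print(total_casualties(0))    # everyone utilitarian -> 20.0
print(total_casualties(100))  # everyone selfish -> 50.0
```

Under these assumed rates, universal self-interest more than doubles total casualties, which is the 'tragedy' Rahwan refers to.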


