How artificial intelligence should be programmed to make life-or-death decisions is being judged by the public in a new study from a US university

The public’s perspective on which decisions autonomous cars should make in fatal situations is being surveyed by the Massachusetts Institute of Technology (MIT).

MIT’s ‘Moral Machine’ poses numerous scenarios to the public in which an autonomous vehicle would need to decide who to kill. Respondents are given two choices, and in each, lives must be lost – there is no non-fatal option. To make each scenario and the victims of each clear, a written explanation is provided, in addition to the graphic demonstration. 

Each individual’s answers are then compared with those of other respondents to gauge where they fit on a series of scales, depending on the different circumstances within the scenarios.

For example, the results compare whether the individual favours young people over the elderly, those upholding the law over those flouting it (for instance, a pedestrian who steps into the road when the crossing light indicates not to cross), or the passengers in the autonomous vehicle over other road users.
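As a rough illustration of that kind of comparison (this is not MIT's actual scoring code, and the scenario attributes are invented), a respondent's forced choices could be tallied along such dimensions like this:

    # A minimal sketch, not MIT's code: tally forced-choice answers along
    # preference dimensions such as age, role and law-abidance.
    from collections import defaultdict

    # Hypothetical scenario: each option records the attributes of the
    # group that would be spared if the respondent picks it.
    scenarios = [
        {"A": {"age": "young", "role": "pedestrian", "lawful": True},
         "B": {"age": "elderly", "role": "passenger", "lawful": True}},
    ]

    def score_responses(scenarios, choices):
        """Count how often the respondent chooses to spare each attribute value."""
        tally = defaultdict(int)
        for scenario, choice in zip(scenarios, choices):
            for attribute, value in scenario[choice].items():
                tally[(attribute, value)] += 1
        return dict(tally)

    print(score_responses(scenarios, ["A"]))
    # {('age', 'young'): 1, ('role', 'pedestrian'): 1, ('lawful', True): 1}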

Patterns have already appeared in users’ answers, including strong preferences towards saving the lives of younger people and of people with ‘higher social value’. In the given examples, a doctor represents someone with high social value and a bank robber someone with low social value.

Another strong preference, unsurprisingly, was to save human lives rather than the lives of pets. Users were split almost 50/50 between saving passengers’ lives and those of other potential victims, and between protecting physically fit people and overweight people.

Sahar Danesh, IET principal policy advisor for transport, said: "The machine will always make the decision it has been programmed to make. It won't have started developing ideas without a database to tap into; with this database decisions can then be made by the machine. With so many people's lives at stake, what should the priority be? It's good to test it in this virtual context, then bring in other industries. 

"The technology hasn't got as far as decision-making software yet, and the regulation surrounding it is not yet in place, which is why these virtual platforms are so important. There has to be a platform and a consultation process before the programming is completed; bring in the insurance industry, legal experts, health professionals and ethics professors to clarify the debate. The more people we can bring together to help make these decisions, the better. Then the algorithms can be made.

"Machine errors are always judged more harshly than human errors, so this is a good opportunity to develop the moral criteria that would go into developing autonomous cars. It's good to gather intelligence to teach a machine ethics; human beings make decisions based on instinct, but a machine doesn't know how to do that. We need to gather this data to design programs that help it make the decisions a human would make, or ideally should make."

The effectiveness of autonomous technology was called into question earlier this year, after a fatal collision occurred while Tesla's Autopilot driver assistance software was activated. The UK government has also held a public consultation on autonomous cars and their future on Britain's roads.

The UK is to host the first autonomous vehicle track day, as autonomous vehicles become more prevalent on road and track.

Comments

9 August 2016
Lol @ the 50% who would mow down overweight people over fit people or people with low social value over high social value. What about discriminating against sexual preference? Mow down people with a mincing gait? Or race? Mow down people who look foreign born over locals? What about people who are ginger, over blondes or dark haired people? Science is the answer, not morality. Rule one. Ensure the survival of maximum numbers. Rule two. Use medical science and accident statistics to enforce Rule one. Kids are more likely to survive a hit from a car than the elderly, for example. The inside/outside the car division is meaningless in light of Rule one. If a car with 5 people in can crash and allow the occupants to walk away to save one person outside the car, and the statistics deem it so, then crash. To hell with the machine. Allow any number of animals to be mown down, to protect people.
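For illustration, the "ensure the survival of maximum numbers" rule described above can be sketched in a few lines. The survival probabilities here are invented purely for illustration, not real accident statistics:

    # A minimal sketch of the "maximise expected survivors" rule.
    # The probabilities are made up; in practice they would come from crash data.

    def expected_survivors(outcome):
        """Sum the survival probabilities of everyone affected by an outcome."""
        return sum(person["survival_prob"] for person in outcome)

    def choose_outcome(outcomes):
        """Pick the outcome with the highest expected number of survivors."""
        return max(outcomes, key=expected_survivors)

    # Outcome A: crash the car; the five occupants walk away, the pedestrian is spared.
    # Outcome B: carry on; the occupants are untouched, the pedestrian is struck.
    outcome_a = [{"who": "occupant", "survival_prob": 0.95}] * 5 + \
                [{"who": "pedestrian", "survival_prob": 1.0}]
    outcome_b = [{"who": "occupant", "survival_prob": 1.0}] * 5 + \
                [{"who": "pedestrian", "survival_prob": 0.2}]

    print(expected_survivors(outcome_a), expected_survivors(outcome_b))  # 5.75 5.2
    print(choose_outcome([outcome_a, outcome_b]) is outcome_a)           # True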

9 August 2016
In reality there will be a tiny, minuscule number of scenarios where there is an equal possibility of survival between disparate parties, perhaps akin in probability to winning the lottery. In which case, use a random probability engine, AKA a coin toss. It's not hard.
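The suggested tie-break is trivial to express. A minimal sketch, assuming outcomes have already been scored (say, by expected survivors as in the sketch above):

    # When outcomes score (near-)equally, pick one at random rather than
    # encoding any preference between the parties involved.
    import random

    def choose_with_tiebreak(outcomes, score, tolerance=1e-6):
        best_score = max(score(o) for o in outcomes)
        tied = [o for o in outcomes if abs(score(o) - best_score) <= tolerance]
        return random.choice(tied)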

9 August 2016
Perhaps autonomous cars should just do as most people would, given that in a real scenario there is no time to make moral judgements about potential victims. Panic and slam on the brakes.

Citroëniste.

9 August 2016
The car should always protect its occupants, and presumably owners, over everything else, otherwise that is biting the hand that feeds it. Beyond that, it should do its best to minimize loss of life outside of the vehicle at all times. Presumably these vehicles are smart enough to react to things even if they are doing something illegal (e.g. jaywalking), and should be able to prevent any death or injury short of a person or animal coming out in front of the vehicle within less than the distance it would take for the car to stop.

9 August 2016
1- A car may not injure a cat or dog or, through inaction, allow a cat or dog to come to harm.
2- A car must obey orders given it by human beings except where such orders would conflict with the First Law.
3- A car must protect its own existence as long as such protection does not conflict with the First or Second Law or involve a fraudulent insurance claim.

9 August 2016
Just a thought, will you need a driving licence or an ECDL to be in "control" of an autonomous car?

9 August 2016
I can see them having to have very different programming for different countries' local tastes on this one. Personally, I would selfishly have to go with whichever manufacturer put the people in my vehicle first. If you were considering two cars and one put other people's lives before your family's, you only have one choice.

9 August 2016
Cars should be highly sophisticated in classifying pedestrians. Automatic braking should not be applied when the car identifies a member of ISIS.

9 August 2016
Winniethewoo, if you are alone in your car and about to crash into 2 pedestrians will you really want to be driving the car that will sacrifice you to save them? Wouldn't you rather it kill them to save you? That could be the next purchasing decision you need to make.... Not so absurd really is it?

10 August 2016
lamcote wrote:

Winniethewoo, if you are alone in your car and about to crash into 2 pedestrians will you really want to be driving the car that will sacrifice you to save them? Wouldn't you rather it kill them to save you? That could be the next purchasing decision you need to make.... Not so absurd really is it?

Absolutely. Because this is what will make our roads the safest. It's how things are legislated for already. And also because this situation will be very rare, it most likely won't happen to you or me. Realistically, if you are in a situation where pedestrians are about, which let's face it will most likely be in town, you will probably be going less than 40mph. In a safe modern car, like a Volvo XC90 or V40, you will most likely be able to walk away from a crash at that speed. I would rather scrap a car than kill people.

You need simple rules that people can just accept, that don't require masses of interpretation or mind-bogglingly complex sets of ifs and buts. These rules will never be fair to everyone in every circumstance, but compromise is part of life and there are no right answers. I think the overall utility of driverless cars will ensure people accept a statistical chance that, once in a blue moon, their car will choose to kill them over a pedestrian.
