In an emergency, your driverless car decides who dies 

Driverless cars (and all other applications of artificial intelligence) are here to stay. For the first time, non-humans have to make life or death choices on our behalf. In an emergency, your driverless car decides who to save and who to sacrifice. It just might choose to sacrifice you.

On a fateful day, you are lounging in the back seat of your cool, driverless car as it heads down the road at 80 km/h. The brakes fail. A tyre bursts. A child dashes into the road from nowhere. Or some combination of factors. Whatever the emergency, the car has to resolve a dilemma. To the right, a group of boys is playing in an open lot. Ahead, there are people on the sidewalk outside the mall. To the left, there is a deep ditch: certain death or incapacitation for you. This is out of your hands. Your driverless car decides.

Your car gets to decide who is more important and worth preserving: the homeless man on the sidewalk or the pregnant woman crossing the road; the madman who pulls his old truck into the road at breakneck speed, the innocent bystander going about his business, or you. Granted, those scenarios are not likely to happen often, but they will happen.
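One common way of framing this kind of trade-off is as a harm-minimisation rule: score each available manoeuvre by its expected casualties and pick the lowest. The sketch below is purely hypothetical; the manoeuvre names and casualty estimates are illustrative assumptions, not any manufacturer's actual logic or real sensor data.

```python
# Hypothetical harm-minimisation sketch. The function and the numbers
# below are illustrative assumptions, not a real vehicle's decision code.

def choose_manoeuvre(options):
    """Pick the manoeuvre with the lowest expected casualties.

    options: dict mapping manoeuvre name -> estimated casualties.
    """
    return min(options, key=options.get)

# The emergency from the article: swerve right (boys playing),
# continue ahead (pedestrians outside the mall), or swerve left
# (deep ditch: harm to the occupant only).
scenario = {
    "swerve_right": 3,    # group of boys in the open lot
    "continue_ahead": 2,  # people on the mall sidewalk
    "swerve_left": 1,     # the occupant, in the ditch
}

print(choose_manoeuvre(scenario))  # a pure casualty count picks "swerve_left"
```

Note what even this toy version makes obvious: a rule that only counts heads will sacrifice the occupant, which is exactly the outcome the article says buyers may not accept.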


Self-driving is Safe Driving

The good news here is that self-driving cars are designed first to be safe-driving cars. Human rashness and irrational spontaneity at the wheel are eliminated from the driving experience. Whatever rashness and irrationality remain come from other vehicles with human drivers and from human pedestrians. The perfect safe-driving scenario would be one in which every vehicle on the road is driverless, but that will take a long while to achieve.


At the end of the day, academic exercises around the moral dilemma of self-driving cars will remain just that. They are very much like the academic exercises around the privacy dilemma of smartphones. Human beings almost always choose convenience over privacy and safety. History suggests that we will embrace driverless cars with a passion regardless of what decisions those cars make in emergency situations. At least, most of us will.

Mister Mobility

I started blogging about mobile in 2004 as a fun way to share my passion for gadgets and mobile services. My other interests include digital media, speaking and teaching, photography, travelling, and dancing.

3 thoughts on “In an emergency, your driverless car decides who dies”

  • January 20, 2017 at 9:05 am

    Humans being creatures of EMOTION rather than LOGIC, I would trust a well-programmed machine to take important decisions instead of a fickle, unpredictable, emotional human.

    I think this moral issue was addressed by Mercedes Benz.

    In a situation where a decision like this has to be made, the car takes the decision to protect the occupant of the car.

    That’s primary…

    But I feel the right thing is to try and save as many people as possible, even if your car has to kill YOU, the occupant.

  • January 21, 2017 at 10:49 am

    I’m curious as to how advanced the AI in a self-drive car is if the choice is between people on the pavement versus the occupant of the car. Wouldn’t the car – like a human – determine self preservation first?

    Until we encounter that scenario in the real world, we can only guess for now. After all, we’re the programmers, not the programmed.

  • March 16, 2017 at 6:54 am

    Most people will not purchase a car knowing that it can decide to let them die instead of trying to save them at all cost.
