New research shows that people generally approve of cars programmed to sacrifice their passengers to save others, but these same people would not want to be in the car themselves.
– “Should your driverless car KILL you to save pedestrians?” Daily Mail, June 24, 2016
It has been reported that the temperature set point adjustment on thermostats in many office buildings in the United States is non-functional, installed to give tenants’ employees [an] illusion of control. In some cases they … serve no purpose other than to keep employees contented.
– Wikipedia entry on
It’s ironic that it’s now 2016 going on 2017 and the biggest problem on the horizon for us as a society is The Trolley Problem.
Trolleys were all the rage in the late 1800s, and they had a good run for a while. But now that it’s the computer age – a century and a half later – it seems more than a little retro for us to all of a sudden be talking about trolley cars and trolley problems. However, I guess we have no choice.
But please pardon me because I’m getting ahead of myself: I’m putting the trolley before the horse so to speak, so let’s back up and start from the beginning.
Here’s the thing: Driverless cars are coming. They’ll be here before you know it and they are on the way whether you like it or not. The Apple electric driverless car – to be called “the iCar” – won’t be here until September 2020, but right this minute, in California around Google labs, driverless cars are flying along closed roads and, come to think of it, my dad’s street legal new car stops for him if he forgets to brake, and his car takes control in other situations as well.
I feel more than a little personally responsible for this new craze. My insurance company tells me all the time that I have pioneered driverless car ownership, because, they say, as best they can figure out, every car I’ve ever had has been driverless.
So anyway, the future is coming and, in some respects, is already here. Also, if you are saying to yourself, “Bah Humbug, at least I don’t have to get on board; I’m going to continue driving my car myself” – well, I have some very bad news for you. In the future, you won’t be driving your car around because it will be illegal for humans to drive cars.
I’ll tell you exactly how it will happen. At first, people will be greatly worried about the safety of driverless cars, but then, after a few years, they’ll look at the stats and realize that the robot drivers are killing a whole lot fewer people than human drivers. By then, most people will have switched to driverless cars anyway; and then one day the government will say, “Oh, by the way, it is much too dangerous to have humans in control of something as deadly as a car, and, as of Jan. 1, it will be illegal for you to drive your own car.”
And while driverless cars may do better than people, here’s the thing: There are still going to be situations where accidents are unavoidable and the car will have to decide what and who to hit.
As human drivers, we know how to react when the unexpected happens. Like, when you are speeding along in winter and you hit a patch of ice, as humans with driving experience we all know to slam on the brakes, close our eyes so we aren’t distracted and turn the wheel as fast as possible away from the slide. But computers won’t know to do that unless we program it into them.
We’ll also have to program the cars to decide who to maim and kill when there’s no other choice. It might be, for instance, that a runaway baby carriage shoots out into the path of the car and the car’s choices could be something like …
- Keep straight and hit the baby carriage
- Veer to the right and hit a car with five people
- Veer to the left and drive you off the side of a mountain
What should it do? That’s the one giant hurdle society needs to clear before driverless cars take over the roads. Ironically, in other words, to figure out the driverless car dilemma, we need to figure out …
The Trolley Problem. Now, back when I used to teach ethics, I would teach the Trolley Problem. The well-known ethicist Philippa Foot came up with the enticing, enlightening ethical enigma about 50 years ago. The Wikipedia summary is a pretty good one …
There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person on the sidetrack. You have two options:
Do nothing, and the trolley kills the five people on the main track.
Pull the lever, diverting the trolley onto the side track where it will kill one person.
Which is the correct choice?
In ethics class, when we talked about this, a lot of students said that obviously the right thing to do is pull the lever, because that saves the most people. Then, however, I would point out that pulling the lever means taking an action to kill someone. While it’s true that, the other way, five people die rather than one, they don’t die because of something you did – as is the case when you opt to pull the lever and kill the one person.
When it comes to cars driving around in the near future, the car will have a morally difficult “trolley dilemma” anytime a sudden situation forces it to choose who to kill.
Now, in some less terrible scenarios, the right choices would be pretty obvious. For instance, if it’s a case where there is a dog on one side and a cat on the other, well, the car should veer toward the cat since cats are more plentiful and easier to replace (and, quite frankly, less loveable). But what if there were, say, two cats on one side and one dog on the other? Right, the car should still kill the cats – but say you had four or five cats on the right and one dog on the left? Well, then it gets a little harder.
What we need is some sort of ethical calculus, a ranking system that we can program into the car, so it can go down the list and decide who to hit.
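For what it’s worth, that kind of ranking system could be sketched as a simple lookup table – a toy illustration only, with completely hypothetical categories and scores, not anything an actual car company has programmed:

```python
# A toy sketch of the "ethical calculus" idea: a hypothetical priority
# table the car could consult when every option is bad.
# All categories and scores below are made up for illustration.

# Higher score = more important to spare.
PRIORITY = {
    "woman_with_baby_carriage": 7,
    "man_with_baby_carriage": 6,
    "woman": 5,
    "man": 4,
    "inconsiderate_jaywalker": 3,
    "street_mime": 2,
}

def choose_target(options):
    """Given the unavoidable options, return the lowest-ranked target
    (the one the car should mow down)."""
    return min(options, key=lambda target: PRIORITY.get(target, 0))

# Usage: the car must hit either a man or a street mime.
print(choose_target(["man", "street_mime"]))  # street_mime
```

The car just scores each unavoidable target and picks the one ranked lowest – which, of course, only pushes the hard part into deciding what the numbers should be.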
Like on the Titanic, where it was women and children first, I think the same principle should still hold here and the car should run over the man rather than the woman when it has to choose. I’m sure some women will say that’s sexist, but I feel confident those same people will agree with me 100 percent if we have that argument on a sinking ship.
Here’s kind of a rough ranking of who we should save in order of importance. The car can simply see which target ranks higher and mow down the lower-value target:
- A woman pushing a baby carriage
- A man pushing a baby carriage
- A woman walking with no baby carriage
- A man with no baby carriage
- Anyone male or female who is taking their sweet time crossing the road and is only out in the street in the first place because he or she has shown no consideration for the cars that are waiting for him or her to get out of the way
- A street mime
- If it’s a choice between hitting one of two cars, you would go with the car with the fewer people in it. If the number of people were the same in each car, then the driverless car should crash into the car with the lower Blue Book value.
- Rabbits and squ– wait, I just realized something …
We don’t have to worry about the trolley car calculus after all.
Here’s why. Ever since the rise of the machines in the last century, through the modern computer age, all machines have given us the illusion that we are in control but we really have no control over them whatsoever no matter what we think.
It was true of the toaster, which provided us a lever to control how done the toast got – but, whatever the setting, the toaster cooked the toast however it darn well pleased. Then the VCR let us program it, but it always recorded something else, and, when it came to the clock, it only gave you two choices for the time: midnight or noon. We use computers to control Greensboro’s traffic lights and sync them together, and I am sure a great deal of time goes into programming them to maintain a steady flow of traffic – but, in the end, what do the timed traffic lights do? Whatever the heck they want. They just mess with us and try to make our lives miserable. They are like Greensboro’s version of Skynet. And the street crossing buttons are another good example, because they give us the illusion of control, but in most places they are simply “placebo buttons” that make us feel better but aren’t even connected to anything.
No machine ever listens to us. The thermostat in the Rhino Times office sits five feet to the left of me all day long and I can tell you from experience that it has absolutely nothing to do with the temperature in the building. The other day it was 98 degrees outside, the thermostat was set to 72 and the office was jumping constantly between 12 degrees below zero and 89. (The only thing I have been able to figure out so far is that the thermostat gets very angry if we close a door.)
Oh, and my iPhone calls who it wants when it wants no matter what I say.
And mark my words: Driverless cars will be the same way. They will do what they want too no matter what we tell them, so there’s no use even thinking about it.
Listen, every time there’s a new device or new invention, people always think it will listen to us – we say that this time it will do what we say. So if you want to spend your time doing so, you can go ahead and plan what driverless cars should do and program those instructions into them – and then, after that, why not go kick that football, Charlie Brown, because maybe this time Lucy really will hold it for you.