In his excellent The Moral Landscape: How Science Can Determine Human Values,[1] Sam Harris presents a famous experiment called the Asian Disease Problem, originally devised by Amos Tversky and Daniel Kahneman. Here is how Harris presents it: an outbreak of an unusual Asian disease is expected to kill 600 people, and two alternative programs to combat it have been proposed.
If Program A is adopted, 200 people will be saved.
If Program B is adopted, there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved.
Which one of the two programs would you favor?
In this version of the problem, a significant majority of people favor Program A. The problem, however, can be restated this way:
If Program A is adopted, 400 people will die.
If Program B is adopted, there is a one-third probability that nobody will die and a two-thirds probability that 600 people will die.
Which one of the two programs would you favor?
Put this way, a majority of respondents will now favor Program B.
This is interesting because there is no difference between the options. In each case, Program A saves 200 people and kills 400 people, and Program B has a one-third chance of saving everyone and a two-thirds chance of killing everyone. To me it is easy. I am the most unlucky person alive. So if I have a two-thirds chance of killing everyone, I'm going to kill everyone. Therefore, I feel confident that regardless of how the problem was put to me (which it wasn't; I was given both versions, just as you just were), I would go for the sure thing: save 200 lives. But what about other people? Would they really change their answers based upon how the question was asked?
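The equivalence of the two framings is easy to verify with a little arithmetic. This is just a sketch of that check, using the numbers from the problem statement (600 people total, 200 saved by Program A, a one-third chance of saving everyone under Program B):

```python
from fractions import Fraction  # exact arithmetic, no floating-point noise

TOTAL = 600  # total number of people the disease is expected to kill

# Program A, "gain" framing: 200 saved for certain.
saved_a = 200
# The "loss" framing merely states the remainder explicitly:
died_a = TOTAL - saved_a

# Program B: one-third chance everyone is saved, two-thirds chance no one is.
p_save_all = Fraction(1, 3)
expected_saved_b = p_save_all * TOTAL + (1 - p_save_all) * 0

print(f"Program A: {saved_a} saved, {died_a} die")
print(f"Program B: {expected_saved_b} saved on average")
# Both programs have the same expected outcome: 200 saved, 400 dead.
```

So the two questions describe the same pair of gambles; only the wording changes.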
I decided to find out, even though countless graduate students have done this to death. I asked 14 of my friends (it’s hard to believe I have that many, I know). The results I got were not as stark as I had expected, based upon what Harris wrote, but they were pretty much in accordance with what my intuition indicated.
When the first question was asked, four people picked Program A and two picked Program B. That makes sense: you are telling people they can save 200 people and you don’t even mention the 400 who will necessarily die. When the second question was asked, four people picked Program A and four picked Program B: an even split.
I don’t have much to add to this. According to Harris, what this shows is that people overvalue certainty. So, the certainty of saving 200 lives is really compelling. Just the same, the certainty of losing 400 lives is really repugnant. Thus, when we are asked, “Do you want to save 200 people, and by the way, 400 others will die,” we jump on that option. But when we are asked, “Do you want to kill 400 people, and by the way, 200 others will live,” we figure, “Well, maybe I’ll try my luck at saving them all, even though the odds aren’t that good.”
The brain is a funny thing.
[1] I will be writing more about this just as soon as I can find my many pages of notes. You see, I just moved and things are even more out of control than usual. Plus, I am working on the pilot of the first Frankly Curious Media TV show, The Post, Post Modern Comedy Hour. Just in case you were wondering: it’s a half-hour long. Oh! And here’s the theme song:
It’s the post, post modern comedy hour
Your mega-dose of curiosity power
Out goes illusion! Out goes myth!
In comes compassion, knowledge, and bliss
It’s the post, post modern comedy hour!
(Video soon to come!)