I'll tell you a little bit about irrational behavior. Not yours, of course — other people's.
So after being at MIT for a few years, I realized that writing academic papers is not that exciting. You know, I don't know how many of those you read, but they're not fun to read, and often even worse to write. So I decided to try and write something more fun. And I came up with an idea that I would write a cookbook. And the title for my cookbook was going to be, "Dining Without Crumbs: The Art of Eating Over the Sink."
And it was going to be a look at life through the kitchen. I was quite excited about this. I was going to talk a little bit about research, a little bit about the kitchen. We do so much in the kitchen, I thought this would be interesting. I wrote a couple of chapters, and took it to MIT Press and they said,
"Cute, but not for us. Go and find somebody else."
I tried other people, and everybody said the same thing,
"Cute. Not for us."
Until somebody said, "Look, if you're serious about this, you have to write about your research first; you have to publish something, then you'll get the opportunity to write something else. If you really want to do it, you have to do it."
I said, "I don't want to write about my research. I do it all day long, I want to write something a bit more free, less constrained."
And this person was very forceful and said, "Look, that's the only way you'll ever do it."
So I said, "Okay, if I have to do it —" I had a sabbatical. I said, "I'll write about my research, if there's no other way. And then I'll get to do my cookbook." So, I wrote a book on my research.
And it turned out to be quite fun in two ways. First of all, I enjoyed writing. But the more interesting thing was that I started learning from people. It's a fantastic time to write, because there's so much feedback you can get from people. People write to me about their personal experience, and about their examples, and where they disagree, and their nuances. And even being here — I mean, the last few days, I've known heights of obsessive behavior I never thought about.
Which I think is just fascinating.
I will tell you a little bit about irrational behavior, and I want to start by giving you some examples of visual illusion as a metaphor for rationality. So think about these two tables. And you must have seen this illusion. If I asked you which seems longer: the vertical line on the table on the left, or the horizontal line on the table on the right? Can anybody see anything but the left one being longer? No, right? It's impossible. But the nice thing about visual illusions is that we can easily demonstrate mistakes. So I can put some lines on; it doesn't help. I can animate the lines. And to the extent you believe I didn't shrink the lines, which I didn't, I've proven to you that your eyes were deceiving you. Now, the interesting thing about this is when I take the lines away, it's as if you haven't learned anything in the last minute.
You can't look at this and say, "Now I see reality as it is." Right? It's impossible to overcome this sense that this is indeed longer. Our intuition is really fooling us in a repeatable, predictable, consistent way, and there is almost nothing we can do about it, aside from taking a ruler and starting to measure it.
Here's another one. It's one of my favorite illusions. What color is the top arrow pointing to?
Audience: Brown. Dan Ariely: Brown. Thank you.
The bottom one? Yellow. Turns out they're identical. Can anybody see them as identical? Very, very hard. I can cover the rest of the cube up. If I cover the rest of the cube, you can see that they are identical. If you don't believe me, you can get the slide later and do some arts and crafts and see that they're identical. But again, it's the same story: if we take the background away, the illusion comes back. There is no way for us not to see this illusion. I guess if you're colorblind, maybe you can't see it. I want you to think about illusion as a metaphor.
Vision is one of the best things we do. We have a huge part of our brain dedicated to vision — bigger than dedicated to anything else. We use our vision more hours of the day than anything else. We're evolutionarily designed to use vision. And if we have these predictable, repeatable mistakes in vision, which we're so good at, what are the chances we won't make even more mistakes in something we're not as good at — for example, financial decision-making?
Something we don't have an evolutionary reason to do, we don't have a specialized part of the brain for, and we don't do that many hours of the day. The argument is in those cases, it might be that we actually make many more mistakes. And worse — not having an easy way to see them, because in visual illusions, we can easily demonstrate the mistakes; in cognitive illusion it's much, much harder to demonstrate the mistakes to people.
So I want to show you some cognitive illusions, or decision-making illusions, in the same way. And this is one of my favorite plots in social sciences. It's from a paper by Johnson and Goldstein. It basically shows the percentage of people who indicated they would be interested in donating their organs. These are different countries in Europe. You basically see two types of countries: countries on the right, that seem to be giving a lot; and countries on the left, that seem to be giving very little, or much less. The question is, why? Why do some countries give a lot and some countries give a little?
When you ask people this question, they usually think that it has to be about culture. How much do you care about people? Giving organs to somebody else is probably about how much you care about society, how linked you are. Or maybe it's about religion. But if you look at this plot, you can see that countries that we think about as very similar, actually exhibit very different behavior. For example, Sweden is all the way on the right, and Denmark, which we think is culturally very similar, is all the way on the left. Germany is on the left, and Austria is on the right. The Netherlands is on the left, and Belgium is on the right. And finally, depending on your particular version of European similarity, you can think about the U.K. and France as either similar culturally or not, but it turns out that with organ donation, they are very different.
By the way, the Netherlands is an interesting story. You see, the Netherlands is kind of the biggest of the small group. It turns out that they got to 28 percent after mailing every household in the country a letter, begging people to join this organ donation program. You know the expression, "Begging only gets you so far." It's 28 percent in organ donation.
But whatever the countries on the right are doing, they're doing a much better job than begging. So what are they doing? Turns out the secret has to do with a form at the DMV. And here is the story. The countries on the left have a form at the DMV that looks something like this. "Check the box below if you want to participate in the organ donor program." And what happens? People don't check, and they don't join. The countries on the right, the ones that give a lot, have a slightly different form. It says, "Check the box below if you don't want to participate ..." Interestingly enough, when people get this, they again don't check, but now they join.
Now, think about what this means. You know, we wake up in the morning and we feel we make decisions. We wake up in the morning and we open the closet; we feel that we decide what to wear. We open the refrigerator and we feel that we decide what to eat. What this is actually saying is that many of these decisions are not residing within us. They are residing in the person who is designing that form. When you walk into the DMV, the person who designed the form will have a huge influence on what you'll end up doing.
Now, it's also very hard to intuit these results. Think about it for yourself. How many of you believe that if you went to renew your license tomorrow, and you went to the DMV, and you encountered one of these forms, that it would actually change your own behavior? Very hard to think that it would influence us. We can say, "Oh, these funny Europeans, of course it would influence them." But when it comes to us, we have such a feeling that we're in the driver's seat, such a feeling that we're in control and we are making the decision, that it's very hard to even accept the idea that we actually have an illusion of making a decision, rather than an actual decision.
Now, you might say, "These are decisions we don't care about." In fact, by definition, these are decisions about something that will happen to us after we die. How could we care about anything less than something that happens after we die? So a standard economist, somebody who believes in rationality, would say, "You know what? The cost of lifting the pencil and marking a 'V' is higher than the possible benefit of the decision, so that's why we get this effect."
But, in fact, it's not because it's easy. It's not because it's trivial. It's not because we don't care. It's the opposite. It's because we care. It's difficult and it's complex. And it's so complex that we don't know what to do. And because we have no idea what to do, we just pick whatever it was that was chosen for us.
I'll give you one more example. This is from a paper by Redelmeier and Shafir. And they said, "Would this effect also happen to experts? People who are well-paid, experts in their decisions, and who make a lot of them?" And they took a group of physicians. They presented to them a case study of a patient. They said, "Here is a patient. He is a 67-year-old farmer. He's been suffering from right hip pain for a while." And then, they said to the physicians, "You decided a few weeks ago that nothing is working for this patient. All these medications, nothing seems to be working. So you refer the patient for hip replacement therapy. Hip replacement. Okay?" So the patient is on a path to have his hip replaced.
Then they said to half of the physicians, "Yesterday, you reviewed the patient's case, and you realized that you forgot to try one medication. You did not try ibuprofen. What do you do? Do you pull the patient back and try ibuprofen? Or do you let him go and have hip replacement?" Well, the good news is that most physicians in this case decided to pull the patient and try ibuprofen. Very good for the physicians.
To the other group of physicians, they said, "Yesterday when you reviewed the case, you discovered there were two medications you didn't try out yet — ibuprofen and piroxicam. What do you do? You let him go, or you pull him back? And if you pull him back, do you try ibuprofen or piroxicam? Which one?" Now, think of it: this decision makes it just as easy to let the patient continue with hip replacement, but pulling him back, all of a sudden it becomes more complex. There is one more decision. What happens now? The majority of the physicians now choose to let the patient go for a hip replacement. I hope this worries you, by the way —
when you go to see your physician. The thing is that no physician would ever say, "Piroxicam, ibuprofen, hip replacement. Let's go for hip replacement." But the moment you set this as the default, it has a huge power over whatever people end up doing.
I'll give you a couple of more examples on irrational decision-making. Imagine I give you a choice: Do you want to go for a weekend to Rome, all expenses paid — hotel, transportation, food, a continental breakfast, everything — or a weekend in Paris? Now, weekend in Paris, weekend in Rome — these are different things. They have different food, different culture, different art. Imagine I added a choice to the set that nobody wanted. Imagine I said, "A weekend in Rome, a weekend in Paris, or having your car stolen?"
It's a funny idea, because why would having your car stolen, in this set, influence anything?
But what if the option to have your car stolen was not exactly like this? What if it was a trip to Rome, all expenses paid, transportation, breakfast, but it doesn't include coffee in the morning? If you want coffee, you have to pay for it yourself, it's two euros 50.
Now in some ways, given that you can have Rome with coffee, why would you possibly want Rome without coffee? It's like having your car stolen. It's an inferior option. But guess what happened? The moment you add Rome without coffee, Rome with coffee becomes more popular, and people choose it. The fact that you have Rome without coffee makes Rome with coffee look superior, and not just to Rome without coffee — even superior to Paris.
Here are two examples of this principle. This was an ad in The Economist a few years ago that gave us three choices: an online subscription for 59 dollars, a print subscription for 125 dollars, or you could get both for 125.
Now I looked at this, and I called up The Economist, and I tried to figure out what they were thinking. And they passed me from one person to another to another, until eventually I got to the person who was in charge of the website, and I called them up, and they went to check what was going on. The next thing I know, the ad is gone, no explanation.
So I decided to do the experiment that I would have loved The Economist to do with me. I took this and I gave it to 100 MIT students. I said, "What would you choose?" These are the market shares — most people wanted the combo deal. Thankfully, nobody wanted the dominated option. That means our students can read.
But now, if you have an option that nobody wants, you can take it off, right? So I printed another version of this, where I eliminated the middle option. I gave it to another 100 students. Here is what happened: Now the most popular option became the least popular, and the least popular became the most popular.
What was happening was the option that was useless, in the middle, was useless in the sense that nobody wanted it. But it wasn't useless in the sense that it helped people figure out what they wanted. In fact, relative to the option in the middle, which was get only the print for 125, the print and web for 125 looked like a fantastic deal. And as a consequence, people chose it. The general idea here, by the way, is that we actually don't know our preferences that well. And because we don't know our preferences that well, we're susceptible to all of these influences from the external forces: the defaults, the particular options that are presented to us, and so on.
One more example of this. People believe that when we deal with physical attraction, we see somebody, and we know immediately whether we like them or not, if we're attracted or not. This is why we have these four-minute dates. So I decided to do this experiment with people. I'll show you images here, no real people, but the experiment was with people. I showed some people a picture of Tom, and a picture of Jerry, and I said, "Who do you want to date? Tom or Jerry?" But for half the people, I added an ugly version of Jerry. I took Photoshop and I made Jerry slightly less attractive.
For the other people, I added an ugly version of Tom. And the question was, will ugly Jerry and ugly Tom help their respective, more attractive brothers? The answer was absolutely yes. When ugly Jerry was around, Jerry was popular. When ugly Tom was around, Tom was popular.
This of course has two very clear implications for life in general. If you ever go bar-hopping, who do you want to take with you?
You want a slightly uglier version of yourself.
Similar, but slightly uglier.
The second point, of course, is that if somebody invites you to go bar-hopping, you know what they think about you.
Now you get it.
What is the general point? The general point is that, when we think about economics, we have this beautiful view of human nature. "What a piece of work is a man! How noble in reason!" We have this view of ourselves, of others. The behavioral economics perspective is slightly less "generous" to people; in fact, in medical terms, that's our view.
But there is a silver lining. The silver lining is, I think, kind of the reason that behavioral economics is interesting and exciting. Are we Superman, or are we Homer Simpson?
When it comes to building the physical world, we kind of understand our limitations. We build steps. And we build these things that not everybody can use, obviously.
We understand our limitations, and we build around them. But for some reason, when it comes to the mental world, when we design things like healthcare and retirement and stock markets, we somehow forget the idea that we are limited.
I think that if we understood our cognitive limitations in the same way we understand our physical limitations, even though they don't stare us in the face the same way, we could design a better world, and that, I think, is the hope of this thing.
Thank you very much.
Behavioral economist Dan Ariely, the author of Predictably Irrational, uses classic visual illusions and his own counterintuitive (and sometimes shocking) research findings to show how we're not as rational as we think when we make decisions.
The dismal science of economics is not as firmly grounded in actual behavior as was once supposed. In "Predictably Irrational," Dan Ariely told us why.