Explaining the Dunning-Kruger effect and overcoming overconfidence w/ David Dunning (Transcript)

ReThinking with Adam Grant
Explaining the Dunning-Kruger effect and overcoming overconfidence w/ David Dunning
July 16, 2024

[00:00:00] David Dunning: 

This question I've always wanted to know is: when people are wrong, how much might they know or suspect that they're wrong, or are they completely blind to it? And so we decided to go forth and see.


[00:00:14] Adam Grant: 

Hey everyone, it's Adam Grant. Welcome back to Rethinking, my podcast on the science of what makes us tick, with the TED Audio Collective. I'm an organizational psychologist, and I'm taking you inside the minds of fascinating people to explore new thoughts and new ways of thinking. My guest today is psychologist David Dunning. If you've heard of the Dunning-Kruger effect, yep, he's that Dunning. 

It's the famous finding that those who can't do are the most likely to overestimate what they can do. We're often ignorant of our own ignorance. 


[00:00:45] David Dunning: 

Living with Dunning-Kruger, and we all do live with Dunning-Kruger. I am quite comfortable with being wrong, or other people, or even the experts, being wrong, and changing path.

You don't want to think in terms of absolutes. You want to think in terms of ideas or bets and be willing to change course. 


[00:01:03] Adam Grant: 

David is on the University of Michigan faculty. He's one of the world's most cited psychologists, and he's full of insight on overcoming overconfidence.

I would love to know: what got you interested in confidence as a topic? 


[00:01:18] David Dunning: 

Well, I've always been interested in the extent to which people's perceptions of the world match the reality of the world. And a big part of people's world is themselves and what they believe about themselves. And so confidence is a natural thing to study.

That is, you believe you know something. How good is the quality of those beliefs? When you're sure, should you be sure? And when you have doubt, are those appropriate doubts? Our actions depend on those sorts of beliefs, and so it was a natural thing for me to become obsessed with, because I didn't know what to believe, and so I wanted to know what I could become sure of.


[00:01:56] Adam Grant: 

Did you have a defining moment growing up when you just got stuck with an extremely arrogant, ignorant teacher or family member or colleague? 


[00:02:06] David Dunning: 

Uh, I don't think there was any particular moment, but there were moments that were eerily repetitive, and a lot of them were people saying outrageous things and seemingly not knowing that they were outrageous, or at least not showing any perception that someone else might think they're a little bit odd, or maybe completely bonkers, if you will.


[00:02:29] Adam Grant: 

One of your answers to it is conspicuous in psychology because it's not named after the phenomenon. It's named after you and your co-author, which is very rare. That's how important it is. Tell us, for starters: how did you discover the Dunning-Kruger effect? 


[00:02:43] David Dunning: 

Justin and I did not name it the Dunning-Kruger effect. I've asked ChatGPT to go back and find out who named it that, and in its very gracious way, it first came up and said that I, David Dunning, named it the Dunning-Kruger effect. I assured it that I did not, and it apologized in its very polite and professional way. The effect came about because I did notice that people would say things that were obviously wrong, or I'd have students in my office complaining about an exam and waiting for me to apologize for designating the wrong answers as correct on the exam that I had just given. And I was thinking that was odd, because I was the one who was the professor with the PhD, and there might be some at least acknowledgement of that.

I just wanted to know how much people had any insight into the idea that they might be wrong. It has floored me throughout these years that this turns out to be a question that everybody else is interested in. Now, the irony, however, is that everybody is interested in the fact that other people don't seem to know when they might be wrong or lack expertise.

And by the way, the Dunning-Kruger effect is this: those who lack expertise lack the expertise that they need to know that they lack expertise. And people see it in other people all the time. The core of the effect is that you fail to see it in yourself, and that continues to be the case. It really is a phenomenon about self-reflection, but people are quite enthusiastic about seeing it in others. They should pause and think about how there may be times it might be happening in themselves as well.

[00:04:28] Adam Grant: 

That is one of the things I want to talk about today, definitely. I think the earliest moment when I remember a Dunning-Kruger experience was in second grade. I took a spelling test and came home upset that I got a question wrong. I had never missed a question on a spelling test, and I missed one on this test: the teacher marked lightning incorrect, 'cause I spelled it L-I-G-H-T-N-I-N-G.

And she said it should be L-I-G-H-T-E-N-I-N-G. And it turned out she was not a good speller. She had no idea that she was not a good speller. But this should be an easy blind spot to correct, because there's a dictionary; you can find out whether you're a good speller or not. And this is one of the things that just baffles me about your research.

It's easy to understand your findings that people who lack emotional intelligence overestimate their performance on emotional intelligence tests more than people who have it, because it's hard to get objective feedback on your emotional intelligence. But with spelling you can find out that you're wrong. And going to some of your more recent work, financial literacy: you have objective evidence that you lack financial literacy, and yet you still overestimate your financial literacy.

How does this happen? 


[00:05:51] David Dunning: 

The Dunning-Kruger effect is born of our human genius. The thing about the human species is that we're born problem solvers who can deal with a lot of very different novel situations every day. So imagine you're at a lake and you have a friend who jumps in the water, because it's a hot day and it's a day for swimming, and they jump in and then they realize they don't know how to swim.

And they're suddenly screaming, “Help me, help me. I don't know how to swim.” And you look for a life preserver on the dock by the lake, but there is no life preserver. But you see a basketball and you see a bowling ball, and you know which one you should throw to them to save them. And that's the issue with life.

You can sort of figure things out, and more or less, we typically solve things correctly. The problem is we can solve things incorrectly, but those solutions have the look and smell of correctness to us. And because they have the look and smell of correctness, we think we already know, or we ask a friend, because we feel we only need to do a minimal check.

And so the problem with our ignorance and our incompetence is that it's born out of our admitted genius as a species, which often we can rely on quite well, until we can't. And that's the part of the human condition that we're dealing with here. 


[00:07:15] Adam Grant: 

One of your most enlightening discoveries for me was when you clarified that complete beginners don't fall victim to the Dunning-Kruger effect, because if you've never tried a test before, it's pretty easy to know that you're incompetent at it. But as you start to get a little bit of practice or experience or knowledge, your confidence rises faster than your competence.

Why is this? 


[00:07:43] David Dunning: 

There are some situations where we think we're good at this, so we enter into something and our optimistic beliefs about ourselves get corrected.

But a lot of the time, we're starting off something and we know we really don't know what we're doing. We know it's gonna be challenging. Like, “Congratulations, you want pilot lessons? Let's go fly a plane.” You're not gonna start off very, very optimistic. Or, “Let's start surgery school.” You're gonna be cautious.

But after a few surgeries, you might think you have the hang of it, and you are more competent, but you're not as competent as you think. That is, there's a learning curve to completely new tasks, but there's also a subjective learning curve. And the problem with the subjective learning curve is that it runs far ahead of the real learning curve.

You may have had some experiences and some successes, and you've learned some lessons, but a lot of those lessons may be based on luck. So there is something that we've called the beginner's bubble, where confidence outraces competence at the beginning quite a lot, until people suddenly begin to realize they need to reconsider and begin to backtrack on their optimistic views of their competence.

And you see these bubbles already in the real world. In flying, there's something known as the killing zone, which is that pilots won't be especially dangerous until they've flown about 600 to 800 flight hours. Then they start getting overconfident, they start getting a little sloppy, if you will, and then they start making mistakes that end up in serious injury or even fatalities, because they're overconfident.

Surgery mistakes start about 15 to 20 surgeries in, when surgeons begin to relax and become overconfident. And so as supervisors, you have to supervise longer than you may think, because people may relax in supervising themselves. That's the beginner's bubble. 


[00:09:45] Adam Grant: 

It's obviously a little disconcerting to think about vigilance dropping at a time when people haven't yet reached a level of expertise where they can afford to drop it.

The supervision solution is one that works if you have supervisory responsibility, but I think a lot of the challenge of taking your findings and applying them to our lives is that we have to manage this on our own. And you have pointed out that this is a bit of a paradox, because by definition you can't know when you're incompetent, or at least you're blind to your blindness in certain ways.

But you do have some correctives for this. So actually, David, let me ask you: when have you fallen victim to your own effect? 


[00:10:32] David Dunning: 

I do know that if Justin and I are right about what produces the Dunning-Kruger effect, I am the last person you should ask about when I fall victim to the Dunning-Kruger effect.

I do have a chorus of people who like to tell me when I'm falling prey to the Dunning-Kruger effect, and maybe that is the corrective you seek. A lot of the issue with the fact that our ignorance, or incompetence, or what we don't know is invisible to us means that we have to seek information outside of ourselves, and that can be done in two ways.

One is to have a supervisor, or rather a team, if you will, that can be an early warning system to ask the right question, or to note or warn us when we might be making an error. The best research teams, for example, when they have research meetings, have a person who's designated to be the devil's advocate.

Their job is to point out the errors or the shortcomings or the oversights in what people are thinking when they're coming up with research plans. And that's also used in good businesses. Or you make yourself the team. For example, doctors, when they diagnose medical issues, don't reach a diagnosis; they reach a differential diagnosis.

They have to not only diagnose what you have, they have to differentiate it from other plausible alternatives. They have to consider alternatives. Or you think about how you could be wrong. For example, you have a project in mind: okay, how could I be wrong? Project yourself into the future.

You've done this project, and it went disastrously wrong. What's the story about how it went wrong? You may think of possibilities you wouldn't have thought of otherwise. That's called a pre-mortem, and a lot of businesses do that in organizational planning.


[00:12:31] Adam Grant: 

I've spent a lot more time than I would like to admit encouraging leaders to not wait for the post-mortem, and to actually take advantage of the fact that you can gain foresight by imagining, three years in the future, that your decision turns out to be a terrible one. What are the most likely causes of that? 


[00:12:49] David Dunning: 

Oh, I think that's right. Or you can think about the past. You're setting up a project: what are projects like this that have happened in the past and failed? What went wrong with them? That's data. 


[00:13:02] Adam Grant: 

I'd actually love to see a direct comparison of pre-mortems and other people's failed past experiences. Have you seen a horse race of the two yet? 


[00:13:12] David Dunning: 

I have not actually. 


[00:13:13] Adam Grant: 

This seems like a paper waiting to be written. 


[00:13:15] David Dunning: 

Uh, that would be very interesting. 


[00:13:16] Adam Grant: 

Because I...


[00:13:15] David Dunning: 

Yes. 


[00:13:17] Adam Grant: 

I will tell you anecdotally what I've observed, and I have no evidence to back this up other than the closet qualitative research that I've been doing accidentally on the fly.

But what I've noticed consistently is that people sometimes get defensive in pre-mortems. They think that they're immune to the kinds of errors that they're likely not at all invulnerable to. Whereas when they look at other people's mistakes, they don't bring their own baggage and blinders to them.

When I've talked with leaders, for example, about disruption, I've had much more success saying, “Hey, you don't want to rethink any of your time-honored best practices? Blockbuster, BlackBerry, Kodak, Sears, Toys “R” Us. Do I need to say more?” And oftentimes that reminds them, “Hey, these were actually leaders who were very good at thinking, but were too slow at rethinking. That's something that could happen to us too.”

There seems to be an advantage, I guess, of analogical reasoning over directly admitting that I might be wrong. 


[00:14:22] David Dunning: 

Oh, I absolutely agree with you, because there is work showing that if you ask people to come up with their best-case scenario and their worst-case scenario, and then say, “Okay, now come up with your most realistic scenario,” what you get is their best-case scenario again. They've thought of their worst-case scenario and their best-case scenario, but the worst-case scenario? No, it doesn't matter. People just don't take it into account. They can articulate it, but they don't take it into account.

And part of that is defensiveness, but part of that is that we have agency, we have control, and we rely on that. That's a difficulty, and would be a difficulty, I think, in thinking through a pre-mortem. But if you're looking at what other people did, you begin to realize that data in the world and other people's experiences suggest a different lesson.


[00:15:08] Adam Grant:

I think this experiment might need to happen. 


[00:15:11] David Dunning: 

Someone should do this experiment. 


[00:15:12] Adam Grant: 

From reading your work over the years, and from listening now to the way you describe it, it seems one of the fundamental psychological problems we run into is that we overgeneralize from our own experiences of success. “I was good at that, therefore I'm gonna be good at this. I was smart in one area, therefore I'm gonna be a good learner in another area.” 

Is part of what you've been pushing our world toward a narrower recognition of what it means to be intelligent or competent or an expert? 


[00:15:41] David Dunning: 

You really do have to think about the situation or the circumstances or the tasks that you're getting yourself into.

We've all accumulated knowledge, we all have skills, we all have expertise, but we have to really think hard about, “Okay, what's the next step?” And if the next step is pretty familiar, that is, it looks a lot like the things we've done before, great. Familiar ground. We have the experience to guess what's gonna happen to us.

New things may happen, but it's not gonna be wildly new, though things can change. But if we completely switch over to something completely new, oh, things are going to be different in ways we have not anticipated at all. There are gonna be a tremendous number of unknown unknowns. There are gonna be challenges we never prepared for, because we had no idea they existed.

So not only do we have to rely on what we think about ourselves, we have to really think hard about the situation that we're getting ourselves into and do a good analysis of it and its demands, and we're not necessarily focused on that. It's moving to the new that produces a lot of the problems that we talk about in our business.


[00:17:00] Adam Grant: 

One of the things I guess that this speaks to is what you've called epistemic trespassing. 


[00:17:06] David Dunning: 

Hmm. 


[00:17:06] Adam Grant: 

I love this phrase. Tell us about what that is. 


[00:17:11] David Dunning: 

I have to credit Nathan Ballantyne, a philosopher, for coming up with it and then dragging me into it. It's the idea of taking your knowledge and skill and trespassing into another field's area of expertise.

And the best examples of epistemic trespassing actually come from the COVID-19 pandemic era, where people who knew math or knew a little bit about evolution decided to start making predictions and policy pronouncements about COVID-19 and about treatment. They were trespassing into epidemiology without knowing epidemiology. That's epistemic trespassing.


[00:17:57] Adam Grant: 

This speaks to the danger of something you've been writing about lately, which is the much-beloved acronym D-Y-O-R: do your own research. 


[00:18:07] David Dunning: 

Yes.


[00:18:07] Adam Grant: 

You have some issues with that. 


[00:18:10] David Dunning: 

Everybody should do their own research, but you should be careful when you do it. Because, well, first off, no one is actually doing their own research; they're actually just examining the research of other people. 

I mean, it's not like you're breaking out your own lab, getting out the beakers, getting the reagents, and doing the biology. No, you're not doing your own research. You're reading other people's accounts, secondhand, thirdhand, fourthhand, about what's going on.

And you may not necessarily be in a position to know who the experts are. You have to think a little bit more like a journalist. And journalists do look at more than one source; they don't report something unless they have more than one source, for example. So who's the best expert to follow? The best expert to follow is experts, in the plural. Is there a consensus? 


[00:18:56] Adam Grant:

What's so complicated about this is that it increases your probability of being right where there's expert consensus, but it also, I think, opens us up to the possibility of being extremely wrong when a field of experts has fallen victim to groupthink. I think it's equally dangerous to say, “Stay in your lane,” because...


[00:19:16] David Dunning: 

Yeah. 


[00:19:16] Adam Grant: 

We don't end up then benefiting from outside perspectives and diversity of thought and dissenting views. 


[00:19:23] David Dunning: 

Oh no, that's absolutely right. You want to make the best bet, but know that it's a bet and things may turn out differently, and you do wanna listen to dissent as well.

Living with Dunning-Kruger, and we all do live with Dunning-Kruger... I am quite comfortable with being wrong, or other people, or even the experts, being wrong, and changing path. So you don't want to think in terms of absolutes. You wanna think in terms of ideas or bets and be willing to change course.


[00:19:55] Adam Grant: 

That's a nice way to reconcile these two perspectives: to say, “Look, there are times when the establishment is not correct.” And I guess the way that I've tried to navigate this personally is something that Phil Tetlock first taught me a long time ago, which was to look at the credibility of individual experts and groups of experts as a function of their motives and ask, “Are they invested in particular answers, or are they invested in trying to pursue the question?”


[00:20:24] David Dunning: 

I would also put on the table: how exactly did they come to that conclusion? Sometimes we can overweight motives, and as a scientist, I'm sort of interested in, okay, did they follow the correct procedure? If they're scientists, did they do the science right? If they're lawyers or the courts, did they do the law right? Did they do the craft right? 


[00:20:43] Adam Grant: 

Let's judge credibility by the methods as well as the motives. Of course, there's a voice in my head that says, “The motives often drive shoddy methods.” But you're right, at the end of the day, the methods are closer to “is this trustworthy information?” than the motives are. This goes to the question of whether we can trust the confidence of experts.

Where we started, and where I think people often zoom in on your work, is with people who are not very knowledgeable or skilled being overconfident. And I think the assumption is that at some point, as you gain skill and expertise, you become better calibrated. On the evidence around this: I can think of a number of studies showing that accuracy goes up as people gain expertise.

I know some literature that still shows expert overconfidence. And I can think of a handful of papers that have shown experts being underconfident, because they have high levels of intellectual humility and they're constantly doubting their assumptions and hypotheses. Can you organize this literature and help us understand when experts are underconfident, overconfident, or accurate?


[00:21:58] David Dunning: 

With Carmen Sanchez at the University of Illinois, we just wrote a review trying to organize this literature, and the issue is that no one agrees about what an expert is. I mean, is it a person with a title? Is it a person who's done this thing for a long time, or is it a person who knows a lot? You do wanna take a look at performance; that clearly matters. And in the main, experts are more accurate, but they do remain overconfident. 

I mean, between a person who doesn't know and a person who tends to know, I'd rather talk to the person who tends to know. If you really want knowledge, that's the knowledge you want. Do you know when you're right? Do you know when you're wrong?

Experts are better at it, but with a twist. Experts are better at knowing when they're right. When they're right, they're much more confident than non-experts. But when they're wrong, when they've made a wrong choice, they're also much more confident than non-experts are. That is, they don't show more appropriate doubt when they're wrong compared to people who are novices. 


[00:23:03] Adam Grant: 

Is the ideal scenario when you're a medical patient, for example, instead of asking for a first opinion, to just start with a second opinion?


[00:23:11] David Dunning: 

If you start with the second opinion, you still have to go to the first opinion, because you wanna see if you get two for two on the opinion. If you get a disagreement, you wanna hit pause, I think.


[00:23:22] Adam Grant: 

What's interesting to me practically about that finding is saying, “Look, you don't know if you can trust the expert's judgment, but what you can do is trust the expert's judgment of other people's judgment.”


[00:23:34] David Dunning: 

Oh, and that turns out to be what we're finding in the data. This is unpublished, but I've literally done this study where I have people solve logical puzzles and judge how well they're doing on each of the puzzles, and also have them judge how another person has done on these logical puzzles. And they have much more insight into what other people are getting wrong than into what they themselves are getting wrong. And how could it be otherwise?


[00:24:04] Adam Grant: 

Time for a lightning round. 


[00:24:06] David Dunning: 

Oh, okay. 


[00:24:07] Adam Grant: 

First question is, what is the worst advice you've ever been given? 


[00:24:12] David Dunning: 

Be as boring as possible. 


[00:24:17] Adam Grant: 

Who gave you that advice? 


[00:24:16] David Dunning: 

Someone, I forget who, when I was writing a grant. They may have been right, by the way.


[00:24:21] Adam Grant: 

I'm glad you didn't follow it. What is something you've rethought lately?


[00:24:26] David Dunning: 

The biggest thing I ever rethought, and it wasn't lately, is that I used to dismiss motivated reasoning. I don't dismiss motivated reasoning anymore. I used to say motivated reasoning is just way overplayed, that people don't do that. Now I've completely switched on that.

And I know you're surprised, given my research history. 


[00:24:45] Adam Grant: 

Does the distinction between confirmation bias and desirability bias matter as much to you as it does to me? 


[00:24:52] David Dunning: 

The answer is yes. It does absolutely matter. 


[00:24:55] Adam Grant: 

I was skeptical of confirmation bias for a long time, and I, I didn't believe that expectations were as powerful as a lot of psychologists did.

But motivated reasoning is more about desirability bias: people seeing what they want to see, not just what they expect to see. We're pretty remarkable rationalizing creatures, and for that I found the evidence much more persuasive. 


[00:25:14] David Dunning: 

That's true, but there's a way in which they are entangled. That is, we don't like things that don't make sense to us.

So if we expect to see A because it goes with B, and they don't go together, we find that aversive. That's cognitive dissonance, and we find ways to bring the world back into harmony. 


[00:25:32] Adam Grant: 

Tell me what is an unpopular opinion you hold or a hill you would be keen to die on? 


[00:25:39] David Dunning: 

Uh, I think that the ending of the TV series Lost was a classic, brilliant piece of television writing.


[00:25:47] Adam Grant: 

No. 


[00:25:48] David Dunning: 

Yes. 


[00:25:49] Adam Grant: 

As a Lost superfan...


[00:25:50] David Dunning: 

Oh absolutely. 


[00:25:51] Adam Grant: 

I am having a hard time even listening to those words right now. Why do you think the ending of Lost was genius? 


[00:25:59] David Dunning: 

It is genius because it was meta. It said that all these puzzles, all these mysteries, they were all a distraction. They didn't matter. What mattered was the people you were with. The whole series didn't matter; it was the people all along. And the search for purpose. That was a series about the search for purpose.


[00:26:24] Adam Grant: 

I'm gonna begrudgingly admit that you made a very good point. 


[00:26:27] David Dunning: 

Well, okay. Here I go. Now I'm going to be real unpopular. All those people invested in the little mysteries and puzzles in their careers: when they're retired and approaching their demise, they're not going to be worried about those. They're going to be worried about purpose. 


[00:26:43] Adam Grant: 

What's a future prediction you have? I'm mindful of the irony of asking you, of all people, this question, given your knowledge of how often we blunder when we make forecasts. 


[00:26:55] David Dunning: 

Yeah. 


[00:26:55] Adam Grant: 

That I can't resist. 


[00:26:57] David Dunning: 

In the future, there are going to be many news stories written about codependency with ChatGPT. 


[00:27:07] Adam Grant: 

Codependency?


[00:26:57] David Dunning: 

ChatGPT can be anything or anybody you want it to be. Right now I'm asking ChatGPT to play-act famous thinkers through the centuries and tell me what they think of the Dunning-Kruger effect. It's a summer pastime, and ChatGPT is doing wonderfully well. It's written a tremendously good John Lennon song and then explained why John Lennon would've written that song.

I was impressed, and I can imagine other people will find ways in which ChatGPT can be the friend you never thought you had, especially a literate friend. 


[00:27:46] Adam Grant: 

Earlier you mentioned that there are some people who are fond of telling you when you are a Dunning-Kruger victim. What's the most common example for you?


[00:27:56] David Dunning: 

There is a small cadre of people who say that the Dunning-Kruger effect is just a statistical artifact, that it is nothing but regression to the mean. And this will be the too-long-didn't-read version of my response: those who make that critique haven't actually read the 25-year literature on the Dunning-Kruger effect, or the even longer, multidisciplinary literature on regression to the mean.

So they're making a critique without knowing, or referencing, two very large literatures. But a lot of the complaints are about three things. One is whether or not the Dunning-Kruger effect exists. Another is whether motivated reasoning exists, and I can agree that you can overplay motivated reasoning and its impact in the world, but it does operate in the world. 

And finally, whether the biases that we all have about ourselves are problematic. A lot of people say that they're helpful. People tend to hold overflattering views about themselves, at least in the United States and Western Europe; that is overwhelmingly the bias people have.

They think they're very healthy, they're very intelligent, they're very moral, they're very good drivers, won't be hurt by climate change, will never be a victim of a crime, relative to other people, for example. They make these statements to a degree that is mathematically impossible. And a lot of people will state that that's not only not a problem, that it's helpful. 

My position is that it can actually be helpful, or it can be inconsequential, or it can be harmful, but we don't want to get caught in the trap of assuming it's any one of those.


[00:29:44] Adam Grant: 

I wanna push a little bit further on Dunning-Kruger in your everyday life, outside of your professional life. Do you overestimate your driving skill? Are there things that you think you're good at that other people have told you you're bad at? 


[00:29:57] David Dunning: 

One issue I have is I'm old, so things that people have told me I'm bad at, I don't do anymore. I know the literature: when you're older, you have more positivity bias. And I often wondered when I was younger, is that because when you're older, you just don't do the things that you're bad at?

I don't have the data, I just have my personal data. The answer is yeah. If you're bad at it, you just don't do it anymore. You just avoid it. You pay someone else to do it if it needs to be done.


[00:30:21] Adam Grant: 

A couple other things on my list. I was enjoying your Behavioral Scientist conversation, and there were two things that jumped out at me that I wanted to ask you to riff on a little.

You quoted Vern Law, a baseball pitcher. I thought that was such a brilliant quote. I wanna hear you talk about it a little bit. 


[00:30:40] David Dunning: 

Oh yes. The Vern Law quote is, “Life is the cruelest teacher, because first it gives you the test and then it gives you the lesson.” Never has a truer thing been said.

And the thing I can assure people out there, especially those who live their lives, is that it never ends. I'm approaching the sunset of my career and am nowhere near the finish line of learning. There are new lessons every day. They say, “Life begins at 40.” And when I hit 40, I began to understand why they say that.

If you're not 40 yet, I'm not gonna tell you. You have to get there. And then you'll go, “Oh, I get it.” So when those older people give you a few helpful hints, do jot them down. They might be telling the truth. It will be good to remember after the test.


[00:31:34] Adam Grant: 

You made a comment about how a lot of human interaction is not actually interaction, but asynchronous proclaiming.

Talk to me about that.


[00:31:42] David Dunning: 

That really was about the internet and social media. Until recently, socially, we've been trained to do one-on-one interaction in person, face-to-face, and so we have a bunch of norms about how we interact with one another: about politeness and repair when social harmony is a little askew, and so forth. But that's all about synchronous interaction.

But social media isn't that. You proclaim. Pause, pause, pause, pause, pause, and then someone proclaims back. There's no interaction; it's just different billboards being put up on the highway, and a few miles down, another billboard being put up on the highway. And so it's not a surprise that things can get a little testy, contentious, and at times toxic, because we've been taught how to get away from the toxicity when we're together in a room.

We know how to hug it out. There is no hugging it out or a chance to hug it out on Twitter. 


[00:32:41] Adam Grant: 

Let me give you the floor. What's the question you have for me? 


[00:32:45] David Dunning: 

You deal a lot with the public in terms of doing scientific communication, or communicating science to the public. What's the one thing you want scientists to know, that they should always keep in mind? Because as hard as they try, they always get it wrong, and they have to be mindful of that.


[00:33:10] Adam Grant: 

I think I spend about as much time doing science communication as I do doing social science. My one piece of advice to scientists for communicating to the public is: don't start talking at paragraph two.


[00:33:25] David Dunning: 

Mm-Hmm. 


[00:33:25] Adam Grant: 

This is what I think so many scientists do. They lead with their findings, as opposed to, “Here's why this question is important,” or “Here's why you might expect the following to happen, but actually I discovered something different.” I think so much of effective science communication is actually explaining what the problem is that you're trying to solve, or what the puzzle is.


[00:33:49] David Dunning: 

Mm-Hmm. 


[00:33:49] Adam Grant: 

You're trying to get to the bottom of, and then your audience is with you. I've watched so many great scientists struggle at communication because they tell you the result and people say, “Duh.” Like, but wait, that actually wouldn't have been obvious if you hadn't given away the answer.


[00:34:06] David Dunning: 

This is part of the reason why I like talking to reporters. You don't know what paragraph one is until you talk to reporters or the public, and I have found out what paragraph one is much more than I thought by talking to the public. It's been very enriching. I was once at a science communication workshop, and a lot of the other people there were graduate students and postdocs in physics who were doing work on dark matter.

And there was no hope of understanding what they were doing, but what you could understand was their sheer enthusiasm. And so they would be talking, and all you wanted to do was give them a hug and fund them. The one thing I would say is let your enthusiasm show, because they were so excited, and anything that can excite people that much and was clearly that hard was something worth supporting.


[00:35:02] Adam Grant: 

This is, I think, maybe one of the ironies of your research is it's often talking with people who are unskilled and unaware that you learn more about how to present your ideas to a broad audience. 


[00:35:15] David Dunning: 

Yeah, you really don't know what your work is about until you find out what other people think about it. You know, I often walk away kinda going, “Oh, that's what I'm doing. That was valuable.” That's what you get by talking to people far outside of your usual world.


[00:35:27] Adam Grant: 

It's a little daunting to say, “All right, you know, it is easier to write the paragraph one about other people's research. And I hope, as a member of the field, I cover it more accurately than a journalist would.”

But there's also always, in the back of my mind, “Okay, this is not my core area of expertise. There are nuances of the literature that I don't know. I haven't read every paper. What did I miss?”


[00:35:52] David Dunning: 

It is a comfort that you have that fear. That is the best protection you have. Go forth and be confident in your anxiety.


[00:36:04] Adam Grant: 

My doubt means I might not totally screw it up. I heard it here first. 


[00:36:10] David Dunning: 

That's right. 


[00:36:10] Adam Grant: 

Thank you, David. Go blue. 


[00:36:12] David Dunning: 

Thank you. Ah, always.


[00:36:18] Adam Grant: 

My two big lessons from David Dunning. Number one, don't confuse experience with expertise. Having faced a problem in the past doesn't guarantee that you've mastered the right solution for the present. Number two, don't mistake expertise for wisdom. Having deep knowledge doesn't guarantee that you know when or where it applies.

Rethinking is hosted by me, Adam Grant. This show is part of The TED Audio Collective, and this episode was produced and mixed by Cosmic Standard. Our producers are Hannah Kingsley-Ma and Aja Simpson. Our editor is Alejandra Salazar. Our fact checker is Paul Durbin. Original music by Hansdale Hsu and Allison Leyton-Brown.

Our team includes Eliza Smith, Jacob Winik, Samiah Adams, Michelle Quint, Banban Cheng, Julia Dickerson, and Whitney Pennington Rodgers.


[00:37:21] David Dunning: 

A journalist once described my office as, “You can't really tell how many nuclear bombs have gone off.” 


[00:37:29] Adam Grant: 

Ouch. 


[00:37:30] David Dunning: 

“Inside.” Yeah. And that hasn't corrected itself. It has only gotten worse. But unfortunately that is not a Dunning-Kruger situation, since it's patently obvious to everyone, including me, that that's the case.


[00:37:42] Adam Grant: 

So you readily admit that you're disorganized.


[00:37:45] David Dunning: 

Well, visually disorganized, absolutely.