We all think others are foolish, but we never realize that others think the same about us
David Dunning, a professor of psychology at the University of Michigan, has long been dedicated to studying flaws in human thinking.
One well-known psychological phenomenon, closely tied to contemporary political trends, is the Dunning-Kruger Effect, named in part after Professor Dunning himself. It describes a cognitive bias in which people with low ability—such as those who struggle with logical puzzles—often fail to recognize their own shortcomings and instead overestimate their competence.
The following line graph is taken from the original paper that introduced this concept. It shows that the worst performers—those whose actual test scores fall below the median—greatly overestimated their own abilities (while, interestingly, the best performers underestimated theirs).
The Dunning-Kruger Effect can be explained as follows: when we lack skill in a particular area, we also lack the ability to accurately assess our own competence.
This lack of experience often breeds an illusion of superiority. A well-known example is former U.S. President Donald Trump: despite his lack of interest in and understanding of governance, his confidence and arrogance have never wavered.
However, to truly understand the Dunning-Kruger Effect, Professor Dunning believes that we don’t need to mention Trump or look for examples in the news.
Instead, he sincerely suggests that we look inward—to ourselves—to recognize the effect’s influence in our own thinking.
In 2018, I had the opportunity to interview Professor Dunning by phone. He stated, "The first rule of the Dunning-Kruger Effect is that you don’t know you are part of it—and this is exactly what people tend to overlook."
During our conversation, I invited him to discuss a rare yet valuable human trait—intellectual humility, or the ability to recognize that what we believe might be wrong.
Why is this trait so rare and valuable?
Because our brains shield us from our own blind spots.
This is precisely the root of the Dunning-Kruger Effect: we often feel confident about our skills or knowledge in a given area, but when it comes to actual performance, we fail miserably.
Yet, no matter what, we rarely realize that our confidence is misplaced.
So, I asked Professor Dunning: "How should we think about our own self-awareness in order to align it more closely with reality?"
In today’s world, where lies and misinformation spread freely while unpleasant truths are ignored, his response is something that each of us should seriously reflect upon.
The following interview transcript has been edited for clarity and brevity.
Q: Can you tell us what you do?
I study the psychology behind human misconceptions. Why do people believe things that are untrue or even impossible?
Essentially, my research focuses on the question: "How could people possibly believe that?" What led me to study topics like the Dunning-Kruger Effect was the realization that we truly don’t know how ignorant we are. Our ignorance is invisible to us.
Q: What would you like people to know about the limitations of human thinking?
If there’s one psychological principle I wish more people understood, it would be naïve realism.
Naïve realism is the idea that, even if your perception of the world feels absolutely convincing—so obvious that it seems self-evident—it doesn’t necessarily mean it’s correct.
Whenever we arrive at a conclusion, it always seems right to us. But in reality, much of what we see and interpret about the world is constructed by our brains.
If you keep this in mind, you might pause from time to time to consider what kinds of mistakes you could be making, how others might perceive the same situation differently, and perhaps be more open to hearing alternative viewpoints.
Our brains are constantly engaging in creative interpretation. In recent years, several notable examples have emerged that remind us why we need to better understand naïve realism.
One such example is the infamous "blue and black vs. white and gold dress" debate. You look at the dress and think, "It’s obviously white and gold—I just can’t see it any other way!"
But in reality, our brains first make certain assumptions before arriving at a conclusion. What we perceive is a mental construct, not the world itself.
Q: One fascinating yet thought-provoking aspect of your research is that people often misunderstand the Dunning-Kruger Effect and draw incorrect conclusions from it. Do you encounter this often?
Yes, all the time. My research focuses on why people fail to realize that they don’t fully understand something.
So, the frequent misinterpretation of the Dunning-Kruger Effect is both deeply ironic and an excellent demonstration of the phenomenon itself.
That said, among these misunderstandings, there are a few particularly important ones.
First, people tend to think the Dunning-Kruger Effect applies to others—that is, they believe that certain people, apart from themselves, are foolish.
They never consider that they might also be among the foolish.
Of course, such people do exist, but that’s not what my research is about.
My research focuses on the objective fact that this phenomenon will happen to all of us sooner or later.
It’s just that in some individuals, it’s more obvious, while in others, it’s less noticeable. Not realizing the extent of our ignorance is part of human nature.
The key issue, however, is that while we can clearly see this problem in others, we fail to recognize it in ourselves.
The first rule of the Dunning-Kruger Effect is that you don’t know you’re experiencing it—and this is precisely what people tend to overlook.
The second rule is that over the years, society’s understanding of the Dunning-Kruger Effect has shifted—from the idea that incompetent people are overly confident to the belief that beginners are overly confident.
Our research, published last year, shows that beginners don’t initially fall into the Dunning-Kruger Effect, but they do become susceptible to it fairly quickly.
This happens because, in a short period, they believe they have gained a clear understanding of how to handle a task—when, in reality, their skills remain far from sufficient.
Q: People often misinterpret your research. What insights can we gain from these misunderstandings regarding the limitations of human thinking?
Well, these misinterpretations reveal both the limitations and the talents of human cognition.
That is, we have the ability to take a fragment of truth and weave it into a complete and compelling narrative—one that is coherent, reasonable, rich in meaning, and even entertaining.
But just because a story makes sense doesn’t mean it’s true. It simply highlights humanity’s exceptional ability to create narratives.
Q: Are there any solutions or practical techniques to address this issue?
We can draw on the research of Philip Tetlock, a psychologist at the University of Pennsylvania, and his "superforecasters" theory.
He found that when making predictions, people who rely on probabilistic reasoning tend to perform better than those who rely on absolute certainty.
However, that’s just the starting point. The lesson here is that we need to be more cautious about what we think and what we say.
Of course, we don’t need to overanalyze every single thought. But when facing important or complex situations, it’s wise to pause and think carefully before forming conclusions.
Q: In recent years, American media has been dominated by “fake news,” “alternative facts,” and partisan conflicts. What lessons can we learn from your research in this context?
What truly concerns me is that people can no longer distinguish between facts and opinions.
If you survey Democrats and Republicans today, you’ll find that beyond their differences in policy priorities and governing philosophies, they have completely different perceptions of reality itself.
For example:
• “Is the U.S. economy thriving?”
• “How successful was the Obama administration?”
• “Has the stock market gone up or down?”
On these factual questions, Democrats and Republicans give wildly different answers. But these are factual issues.
What has struck me most in recent years is that people are not just inventing their own opinions—they are fabricating their own versions of reality.
In political surveys, I have asked respondents many factual questions—questions where I believe they should simply say, "I don’t know." Yet, overwhelmingly, people avoid choosing that option.
Q: Do Americans refuse to answer factual questions with "I don’t know"? Could this be the subject of a new study?
Yes, this is actually a study we are currently conducting. As I mentioned earlier, we ask many fact-based questions about the U.S., such as:
• “Is the teen pregnancy rate at an all-time high?”
• “What is the financial status of Social Security?”
We ask questions where the answers are objectively known.
At the same time, we use economic techniques to incorporate incentives that encourage people to answer more honestly.
What we’ve found is that, when it comes to factual information about the world, there is indeed a significant gap between the beliefs of Democrats and Republicans.
And now, I want to explore a deeper question: Can we determine whether people’s beliefs are truly authentic?
For instance, we are currently examining the "birther" movement—the conspiracy theory that Barack Obama was born in Kenya.
When someone says, "Obama was born in Kenya," can we tell from their facial expressions or behavior whether they genuinely believe it?
So far, the answer appears to be yes.
Q: Is there any way to make people more comfortable saying, "I don’t know"?
That’s an interesting question—because people really don’t like saying "I don’t know." This has been true throughout history.
I must admit, in over thirty years of research, I have included many survey questions where I thought the correct response should be "I don’t know."
But time and again, people give other answers instead. So how do we get people to admit, "I don’t know"?
Honestly? I don’t know.
Q: Does intellectual humility have a significant impact on individuals?
To give a more gossip-worthy example, I know some top journalists who do excellent work, but most of them tend to be a little neurotic.
Living in a constant state of uncertainty and doubt may not be healthy for everyone. To get closer to the truth, you almost have to be a bit obsessive.
The key is that the most impactful decisions in life are often the ones we rarely encounter.
For example: Where should I buy a house? Whom should I marry? How should I raise my children? These crucial decisions tend to arise in areas where we lack experience—the very domains of our ignorance.
And when we’re struggling with such decisions, that’s precisely when we need advice from others.
Q: Personally, I tend to trust people who are anxious.
I agree. I’ve noticed that neurotic people display extraordinary intelligence when it comes to the subjects they obsess over—often to a surprising degree.
My own passion for my field largely stems from my belief that misfortune is always just around the corner, and that every decision matters.
So, I want to understand: How exactly am I destined to fail? Of course, I recognize that this may not be the healthiest way to live.
Q: Is there a way to maintain a critical, humble mindset while also being aware of our cognitive blind spots—without sacrificing mental well-being?
When making an important decision, ask yourself: Where could I be wrong? What could go unexpectedly wrong with my plan? Thinking things through carefully is crucial.
Double-check all your assumptions and consider: What might I not know? Broadly speaking, many of the problems and struggles we face stem from trying to tackle everything alone.
We rely solely on our own judgment and make decisions from our limited perspective. But if we consult, discuss, or even casually chat with others, we often gain valuable insights and fresh perspectives.
On a larger, more abstract level, proactive social engagement and strong social connections significantly contribute to both mental and physical well-being.
Having more social interactions also helps us update and expand our knowledge.
So, as much as possible, don’t go it alone—unless you want to make things much harder for yourself.