Illusion of Explanatory Depth

Also known as: IOED, Knowledge illusion

The illusion of explanatory depth (IOED) is a cognitive bias where people overestimate their understanding of complex causal systems. We feel we know how things work (like a zipper, a toilet, or the economy), but when asked to explain them in detail, we realize our knowledge is shallow.

The Psychology Behind It

We live in a world of complex objects and systems. To function, we don't need to know how everything works; we just need to know how to use it. We know that turning the handle flushes the toilet. This functional knowledge creates a false sense of deep understanding.

In a famous 2002 study, psychologists Leonid Rozenblit and Frank Keil asked people to rate how well they understood everyday objects like zippers and cylinder locks. People rated their understanding high. Then they were asked to write a detailed, step-by-step explanation of how each object worked. Most couldn't do it. After struggling to explain, they lowered their self-ratings significantly.

This illusion persists because we confuse familiarity with understanding. We also rely on a "community of knowledge"—we know that someone knows how it works, so we feel like we know how it works.

Real-World Examples

Politics and Policy

People often have strong opinions on complex policies like healthcare reform or tax codes. When asked to explain the mechanics of the policy they support, they often can't. Interestingly, attempting to explain the policy often moderates their extreme views, as they realize the issue is more complex than they thought.

Technology

We use smartphones every day and feel we understand them. But if asked to explain how a touch screen actually registers a finger, most of us are stumped.

Science Literacy

Many people accept scientific facts (like "the Earth revolves around the Sun") without being able to explain the evidence or mechanism, yet they feel they "know" science.

Consequences

The illusion of explanatory depth can lead to:

  • Overconfidence: We take strong stances on issues we don't understand.
  • Polarization: Because we think our view is obvious and simple, we think those who disagree are stupid.
  • Poor Decision Making: We vote for policies or buy products based on a superficial understanding of their effects.

How to Mitigate It

The cure for this illusion is the attempt to explain.

  1. The "Explain It to Me" Challenge: Before debating a topic, try to explain the mechanism to yourself or a friend in detail. "How exactly does a tariff work? Who pays it? When?"
  2. Ask "How," Not "Why": Asking "why" leads to values and reasons. Asking "how" leads to mechanisms and reveals gaps in knowledge.
  3. Intellectual Humility: Admit that most of what we "know" is actually knowledge stored in the world or in other people's heads.

Conclusion

The illusion of explanatory depth teaches us that our knowledge is thinner than we think. Recognizing this can make us more humble, more curious, and less dogmatic in our disagreements.

Mitigation Strategies

  • Causal Explanation: Force yourself to write down the causal chain of events: A causes B, which causes C. Gaps will appear immediately. (Effectiveness: high; Difficulty: moderate)
  • Feynman Technique: Explain the concept in simple terms as if teaching a child. If you use jargon to cover up a gap, you don't understand it. (Effectiveness: high; Difficulty: moderate)

Potential Decision Harms

  • Voters support populist policies that sound good as slogans but are economically disastrous, because they don't understand the implementation mechanics. (Severity: critical)
  • Students think they have mastered a topic after reading the chapter, only to fail the test because they never tested their ability to explain it. (Severity: major)

Key Research Studies

Rozenblit, L., & Keil, F. (2002). The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science, 26(5), 521–562.

Coined the term IOED and demonstrated that people overestimate their understanding of complex causal relations.


Related Biases

Explore these related cognitive biases to deepen your understanding:

  • Loaded Language (also: Emotive language): Rhetoric that influences an audience by using words and phrases with strong connotations.
  • Euphemism (also: Doublespeak): A mild or indirect word or expression substituted for one considered too harsh or blunt when referring to something unpleasant or embarrassing.
  • Paradox of Choice (also: Choice Overload): The idea that having too many options can make decisions harder, reduce satisfaction, and even lead to decision paralysis.
  • Choice Overload Effect (also: Paradox of Choice): Occurs when having too many options makes it harder to decide, reduces satisfaction, or leads people to avoid choosing at all.
  • Procrastination (also: Akrasia, weakness of will): The action of unnecessarily and voluntarily delaying or postponing something despite knowing there will be negative consequences.
  • Time-Saving Bias (also: Time-saving illusion): The tendency to misestimate the time that could be saved (or lost) when increasing (or decreasing) speed.