Category

Cognitive Biases

Impact level

2 / 5

Last updated

Nov 2025


Illusion of Explanatory Depth

The illusion of explanatory depth (IOED) is a cognitive bias where people overestimate their understanding of complex causal systems. We feel we know how things work (like a zipper, a toilet, or the economy), but when asked to explain them in detail, we realize our knowledge is shallow.

Also known as: IOED, Knowledge illusion

01

Overview


The Psychology Behind It

We live in a world of complex objects and systems. To function, we don't need to know how everything works; we just need to know how to use it. We know that turning the handle flushes the toilet. This functional knowledge creates a false sense of deep understanding.

In a well-known 2002 study, researchers asked people to rate how well they understood everyday objects such as zippers and cylinder locks. Participants rated their understanding as high. Then they were asked to write a detailed, step-by-step explanation of how each object worked. Most couldn't, and after struggling to explain, they lowered their self-ratings significantly.

This illusion persists because we confuse familiarity with understanding. We also rely on a "community of knowledge"—we know that someone knows how it works, so we feel like we know how it works.

Real-World Examples

Politics and Policy

People often have strong opinions on complex policies like healthcare reform or tax codes. When asked to explain the mechanics of the policy they support, they often can't. Interestingly, attempting to explain the policy often moderates their extreme views, as they realize the issue is more complex than they thought.

Technology

We use smartphones every day and feel we understand them. But if asked to explain how a touch screen actually registers a finger, most of us are stumped.

Science Literacy

Many people accept scientific facts (like "the earth revolves around the sun") without being able to explain the evidence or mechanism, yet they feel they "know" science.

Consequences

The illusion of explanatory depth can lead to:

  • Overconfidence: We take strong stances on issues we don't understand.
  • Polarization: Because we think our view is obvious and simple, we assume those who disagree must be ignorant or acting in bad faith.
  • Poor Decision Making: We vote for policies or buy products based on a superficial understanding of their effects.

How to Mitigate It

The cure for this illusion is the attempt to explain.

  1. The "Explain It to Me" Challenge: Before debating a topic, try to explain the mechanism to yourself or a friend in detail. "How exactly does a tariff work? Who pays it? When?"
  2. Ask "How," Not "Why": Asking "why" leads to values and reasons. Asking "how" leads to mechanisms and reveals gaps in knowledge.
  3. Intellectual Humility: Admit that most of what we "know" is actually knowledge stored in the world or in other people's heads.

Conclusion

The illusion of explanatory depth teaches us that our knowledge is thinner than we think. Recognizing this can make us more humble, more curious, and less dogmatic in our disagreements.

Cognitive processing

System 1 (fast, intuitive). Biases often lean on quick judgments (System 1) unless you slow down and analyze (System 2).

Evidence & time

Evidence strength: experimental. Typical read: about 2 min.

02

Mitigation strategies

Causal Explanation: Force yourself to write down the causal chain of events: A causes B, which causes C. Gaps will appear immediately.

Effectiveness: high

Difficulty: moderate

Feynman Technique: Explain the concept in simple terms as if teaching a child. If you use jargon to cover up a gap, you don't understand it.

Effectiveness: high

Difficulty: moderate

03

Potential decision harms

Voters support populist policies that sound good as slogans but are economically disastrous because they don't understand the implementation mechanics.

Severity: critical

Students think they have mastered a topic after reading the chapter, only to fail the test because they never tested their ability to explain it.

Severity: major

04

Key research studies

The misunderstood limits of folk science: An illusion of explanatory depth

Rozenblit, L., & Keil, F. (2002). Cognitive Science.

Coined the term IOED and demonstrated that people overestimate their understanding of complex causal relations.

