Information Bias
Information bias occurs when we pursue data that does not actually change what we should do. In many situations, a few key pieces of evidence are enough to choose wisely, yet we feel more comfortable gathering extra details, running more tests, or commissioning more reports. The additional information may be interesting, but if it does not alter the decision, it mainly consumes time and resources.
This bias arises from a natural desire to reduce uncertainty and to feel thorough. Asking for "one more" scan, metric, or analysis seems responsible, even when prior results already point clearly in one direction. The danger is that the search for unnecessary information can delay action, distract from the core signal, and provide cover for indecision.
The Psychology Behind It
Several factors underpin information bias. People often equate more information with better decisions, regardless of its diagnostic value. We also have difficulty recognizing when a test is non-instrumental—that is, when its possible outcomes would not change our choice.
Social dynamics play a role: requesting more information can signal diligence to colleagues or clients. Decision-makers may fear criticism for acting "too quickly," so they shield themselves with additional data, even if it is tangential.
Real-World Examples
In medicine, a clinician might order additional imaging tests for a patient even after existing results and guidelines indicate a clear diagnosis and treatment plan. The extra tests add cost and risk without meaningfully changing management.
In business, leaders may delay a product launch while waiting for more focus groups or surveys, despite earlier research already saturating the key questions. They may also drown dashboards in low-impact metrics that clutter attention.
In personal decisions, individuals may spend hours reading marginal product reviews or comparing minor technical specifications when choosing between two very similar options, long after it has become clear that either choice would serve them well.
Consequences
Information bias can lead to analysis paralysis, wasted resources, and decision delays. In healthcare, unnecessary testing can expose patients to radiation, false positives, and anxiety. In organizations, endless data collection can crowd out execution, leaving teams stuck in perpetual planning.
The bias also interacts with other distortions: by emphasizing information that is easy to obtain rather than information that is truly decisive, it can reinforce confirmation bias or status-quo bias.
How to Mitigate It
Mitigating information bias requires focusing on decision-relevant information. A useful question is: "What information could realistically change what we decide or do?" If no plausible result of a test would alter the plan, then the test is likely unnecessary.
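The "could any result change the decision?" test corresponds to the decision-analytic notion of expected value of information: a test is worthless exactly when the optimal action is the same under every possible outcome. A minimal sketch in Python, with all probabilities and payoffs invented purely for illustration:

```python
# Illustration of expected value of information (EVI).
# All numbers below are hypothetical, chosen only to demonstrate the idea.

def best_action_value(probs, payoffs):
    """Expected payoff of the best action; payoffs[action][state]."""
    return max(sum(p * v for p, v in zip(probs, row)) for row in payoffs)

def posterior(prior, likelihoods, outcome):
    """Bayes' rule: P(state | outcome) is proportional to P(outcome | state) * P(state)."""
    joint = [prior[s] * likelihoods[s][outcome] for s in range(len(prior))]
    total = sum(joint)
    return [j / total for j in joint]

def value_of_information(prior, payoffs, likelihoods, outcomes):
    # Value of acting now, using only what we already know.
    baseline = best_action_value(prior, payoffs)
    # Expected value if we run the test first, then act on its result.
    with_test = 0.0
    for o in outcomes:
        p_o = sum(prior[s] * likelihoods[s][o] for s in range(len(prior)))
        if p_o > 0:
            with_test += p_o * best_action_value(
                posterior(prior, likelihoods, o), payoffs)
    return with_test - baseline

# Two states (condition present / absent), two actions (treat / don't treat).
prior = [0.9, 0.1]                        # existing evidence already points one way
payoffs = [[100, -20],                    # treat
           [-50, 10]]                     # don't treat
# A further test whose outcomes are only weakly diagnostic.
likelihoods = [{"pos": 0.6, "neg": 0.4},  # P(outcome | condition present)
               {"pos": 0.5, "neg": 0.5}]  # P(outcome | condition absent)

evi = value_of_information(prior, payoffs, likelihoods, ["pos", "neg"])
```

Here the best action is "treat" no matter which result the extra test returns, so the test's value of information works out to exactly zero: it consumes cost and time without being able to change the plan.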
Decision protocols can help. For major choices, outline in advance what thresholds or evidence would trigger different actions, and evaluate new data against those criteria. Capping the number of analysis iterations and setting decision deadlines also prevent open-ended data chasing.
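Such a protocol can be made mechanical: commit to the thresholds and the iteration cap before any new data arrive, then map each result onto an action. A toy sketch, with the threshold values and action names entirely made up for illustration:

```python
# Hypothetical pre-committed decision protocol. The thresholds and the
# iteration cap are fixed in advance, before any new evidence is gathered.
THRESHOLDS = {"go": 0.70, "abandon": 0.40}  # made-up criteria

def decide(evidence_score, rounds_used, max_rounds=3):
    """Map an evidence score (0..1) and analysis count onto a committed action."""
    if evidence_score >= THRESHOLDS["go"]:
        return "proceed"
    if evidence_score < THRESHOLDS["abandon"]:
        return "abandon"
    if rounds_used >= max_rounds:
        return "proceed with monitoring"  # the deadline forces a call
    return "collect one more round"
```

Because the thresholds are fixed up front, each new analysis either crosses a pre-agreed line or counts against the iteration cap; there is no room to keep requesting "one more" study indefinitely.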
In professional contexts, cultural norms that value timely, well-reasoned decisions over exhaustive but marginal data gathering reduce the pressure to over-collect information. Training in basic decision analysis, such as understanding expected value and diagnosticity, also helps.