We tend to stop analyzing data once we find a conclusion that feels satisfying. This cognitive shortcut, termed "explanatory satisfaction," is often triggered by confirmation bias or a desire for a simple narrative, preventing us from reaching more accurate, nuanced insights.
Intelligence is often used as a tool to generate more sophisticated arguments for what one already believes. A higher IQ correlates with the ability to find reasons supporting one's existing stance, not with an enhanced ability to genuinely consider opposing viewpoints.
As philosopher Alex O'Connor argues, the human brain is optimized not for raw data but for narrative. By asking people to abandon myth and story—the things that feel most real—in favor of statistics, the rationalist movement is asking people to fight their own cognitive wiring.
To combat confirmation bias, withhold the final results of an experiment or analysis until the entire team agrees the methodology is sound. This prevents people from subconsciously accepting expected outcomes while overly scrutinizing unexpected ones, leading to more objective conclusions.
Humans naturally conserve mental energy, a tendency Princeton psychologist Susan Fiske calls being "cognitive misers." For most decisions, people default to quick, intuitive rules of thumb (heuristics) rather than deep, logical analysis. Marketing is more effective when it works with this human nature, not against it.
Humans crave control. When faced with uncertainty, the brain compensates by creating narratives and seeing patterns where none exist. This explains why a conspiracy theory that frames an event as planned can feel more comforting than accepting the event as random and chaotic—the theory offers an illusion of understandable order.
The human brain resists ambiguity and seeks closure. When a significant, factual event occurs but is followed by a lack of official information (often for legitimate investigative reasons), this creates an "open loop." People will naturally invent narratives to fill that void, giving rise to conspiracy theories.
We are cognitively wired with a "truth bias," causing us to automatically assume that what we see and hear is true. We only engage in skeptical checking later, if at all. Scammers exploit this default state, ensnaring us before our slower, more deliberate thinking can kick in.
When emotionally invested, even seasoned professionals can ignore their own expertise. The speaker, a researcher, sought validation from biased sources such as friends instead of conducting objective market research, showing that personal attachment can override professional discipline.
To counteract the brain's tendency to preserve existing conclusions, Charles Darwin deliberately considered evidence that contradicted his hypotheses. He was most rigorous when he felt most confident in an idea—a powerful, counterintuitive method for maintaining objectivity and avoiding confirmation bias.
The brain's tendency to create stories simplifies complex information but creates a powerful confirmation bias. As illustrated by a military example where a friendly tribe was nearly bombed, leaders who get trapped in their narrative will only see evidence that confirms it, ignoring critical data to the contrary.