A Washington Post article claimed COVID was "no longer a pandemic of the unvaccinated" because 58% of COVID deaths were among vaccinated people. This ignored the denominator: roughly 80% of the population was vaccinated, meaning the unvaccinated were actually dying at a much higher per-capita rate.
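A minimal sketch of the denominator correction, using the two percentages above plus a hypothetical total population and death count (the absolute numbers are illustrative; only the ratio matters):

```python
# Hypothetical totals chosen for illustration; the 80% / 58% figures
# come from the example above.
population = 1_000_000
deaths = 1_000            # hypothetical total COVID deaths
vax_share = 0.80          # share of population vaccinated
vax_death_share = 0.58    # share of deaths among the vaccinated

vax_pop = population * vax_share
unvax_pop = population * (1 - vax_share)

# Per-capita death rates: divide each group's deaths by that group's size.
vax_rate = deaths * vax_death_share / vax_pop
unvax_rate = deaths * (1 - vax_death_share) / unvax_pop

print(f"Vaccinated death rate:   {vax_rate:.6f}")
print(f"Unvaccinated death rate: {unvax_rate:.6f}")
print(f"Risk ratio (unvax/vax):  {unvax_rate / vax_rate:.2f}")
```

With these inputs the unvaccinated minority dies at nearly three times the per-capita rate, even though they account for a minority of total deaths.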
We tend to stop analyzing data once we find a conclusion that feels satisfying. This cognitive shortcut, termed "explanatory satisfaction," is often triggered by confirmation bias or a desire for a simple narrative, preventing us from reaching more accurate, nuanced insights.
We are wired by conversational norms (pragmatics) to assume that any information we are given is relevant. When offered irrelevant but vivid descriptive details, we ignore the statistical base rate because we assume the details were provided for a reason, a cognitive quirk that can be exploited.
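The base-rate logic can be made concrete with Bayes' rule. The pool composition and likelihoods below are hypothetical, in the style of the classic engineer/lawyer experiments: even a description that is twice as diagnostic of the minority group leaves the majority group more probable.

```python
# Hypothetical pool: 30% engineers, 70% lawyers.
p_engineer = 0.30          # base rate (the number people ignore)

# Hypothetical likelihoods: the vivid description fits an engineer
# twice as well as a lawyer.
p_desc_given_eng = 0.8
p_desc_given_law = 0.4

# Bayes' rule: the base rate still matters, however vivid the details.
posterior = (p_desc_given_eng * p_engineer) / (
    p_desc_given_eng * p_engineer + p_desc_given_law * (1 - p_engineer)
)
print(f"P(engineer | description) = {posterior:.2f}")
```

The posterior stays below 50%: the description shifts the odds but cannot override a 30/70 base rate on its own.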
Instead of giving a single point estimate, provide a forecast with a lower and upper bound. This approach communicates both what you know and what you don't. It reduces the risk of being perceived as "wrong" and invites others to share information that can help narrow the range.
A company found its top engineers were "difficult." Before changing hiring criteria to favor this trait, they checked their worst-performing engineers and found they were also difficult. The trait was common to all engineers, not a signal of success; examining the failures alongside the successes is what exposed the survivorship bias.
Research from Duncan Watts shows the bigger societal issue isn't fabricated facts (misinformation), but rather taking true data points and drawing misleading conclusions (misinterpretation). By this research's estimate, misinterpretation occurs 41 times more often, making it the more insidious problem for decision-makers.
Instead of teaching decision-making in isolation, education should integrate skills like counterfactual thinking directly into core subjects. Analyzing literature by asking, "What if Macbeth had chosen a different option?" makes the material more engaging and teaches critical thinking simultaneously.
Reporting that hormone therapy caused a "25% increase" in cancer presented a terrifying relative risk (5 cases versus 4). The absolute risk, however, barely moved: from 4 in 1,000 to 5 in 1,000. Understanding this difference is crucial for making informed health decisions.
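The relative-versus-absolute arithmetic from the example above, in a few lines:

```python
# Figures from the example: 4 in 1,000 baseline, 5 in 1,000 with therapy.
baseline_risk = 4 / 1000
treated_risk = 5 / 1000

# Relative increase: sounds alarming ("25% increase").
relative_increase = (treated_risk - baseline_risk) / baseline_risk

# Absolute increase: one extra case per 1,000 people.
absolute_increase = treated_risk - baseline_risk

print(f"Relative increase: {relative_increase:.0%}")
print(f"Absolute increase: {absolute_increase * 1000:.0f} extra case per 1,000")
```

The same underlying change yields a headline number of 25% or 0.1 percentage points, depending solely on which denominator the reporter chooses.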
Instead of banning AI, educators should teach students how to prompt it effectively to improve their decision-making. This includes forcing it to cite sources, generate counterarguments, and explain its reasoning, turning AI into a tool for critical inquiry rather than just an answer machine.
The world has never been truly deterministic, but slower cycles of change made deterministic thinking a less costly error. Today, the rapid pace of technological and social change means that acting as if the world is predictable gets punished much more quickly and severely.
The phrase "I make my own luck" is misleading. Life outcomes are a function of two things: luck (uncontrollable) and decision quality. While you can't control luck, you can consistently make better decisions that increase the probability of favorable outcomes over time.
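A toy simulation of this luck/skill split. The model and its weights are assumptions for illustration: suppose half of any single outcome is pure chance, and decision quality only shifts the odds on the other half.

```python
import random

def simulate(decision_quality, n_decisions=10_000, seed=0):
    """Return the fraction of favorable outcomes over many repeated decisions.

    Hypothetical model: success probability blends fixed luck (a coin flip)
    with controllable decision quality, weighted 50/50.
    """
    rng = random.Random(seed)
    luck_weight = 0.5  # assumption: half of each outcome is pure chance
    p_success = luck_weight * 0.5 + (1 - luck_weight) * decision_quality
    return sum(rng.random() < p_success for _ in range(n_decisions)) / n_decisions

print(simulate(0.5))  # coin-flip decision quality
print(simulate(0.7))  # better decisions: higher long-run success rate
```

Any single outcome remains luck-dominated, but over thousands of decisions the higher-quality decider reliably pulls ahead, which is the point of the note above.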
The common advice to overcome sunk cost fallacy—"imagine you didn't own this, would you buy it today?"—is ineffective because you cannot truly ignore the reality of ownership. A more robust method is setting pre-commitment contracts or "kill criteria" that force a decision when specific signals are observed.
