Many on the Titanic delayed evacuating because its nearly identical sister ship, the Olympic, had survived a similar hull puncture months earlier. That past success bred a false sense of security and reinforced normalcy bias, leading passengers to underestimate the immediate danger.

Related Insights

Gladwell argues that when systems like vaccination are highly effective, people feel safe enough to entertain crackpot theories. The success of these systems removes the immediate, tangible stakes, creating a 'moral hazard' that permits intellectual laziness and fantastical thinking.

Policymakers instinctively rely on historical analogies. While such analogies can be powerful, they become dangerous when built on simplistic or false comparisons like 'another Munich' or 'another Vietnam.' Rigorous, nuanced historical perspective is therefore essential to avoid repeating past mistakes driven by flawed parallels.

Instead of relying on population averages for risk (e.g., car-accident statistics), monitor your own close calls and mistakes. These 'near misses' are latent data points that give a far better personal estimate of your true risk profile, and of how long you can keep going before a critical failure occurs if your habits don't change.
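To make the idea concrete, here is a minimal sketch (not from the source) of how near misses could feed a personal risk estimate, assuming they arrive roughly as a Poisson process and that each one had some fixed chance of escalating into a real incident; the counts and probabilities are illustrative placeholders.

```python
# Minimal sketch (not from the source): estimating personal risk from your own
# near misses instead of population averages. Assumes near misses arrive as a
# Poisson process and that each one had some probability of becoming a real
# incident -- both numbers below are illustrative placeholders.

import math

def expected_incidents_per_year(near_misses: int, years_observed: float,
                                p_escalation: float) -> float:
    """Estimate how often a near miss would have turned into a real incident."""
    near_miss_rate = near_misses / years_observed   # near misses per year
    return near_miss_rate * p_escalation            # expected incidents per year

def prob_incident_within(years: float, incident_rate: float) -> float:
    """Probability of at least one incident within `years`, under a Poisson process."""
    return 1.0 - math.exp(-incident_rate * years)

# Example: 6 close calls while driving over 2 years, and a guess that each close
# call had a 5% chance of being an actual crash (hypothetical numbers).
rate = expected_incidents_per_year(near_misses=6, years_observed=2.0, p_escalation=0.05)
print(f"Estimated incidents per year: {rate:.2f}")
print(f"Chance of a crash within 10 years: {prob_incident_within(10, rate):.0%}")
```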

While the Britannic was sinking, survivor Violet Jessop risked returning below decks for her toothbrush. This seemingly irrational act, prompted by a minor regret from her Titanic survival, provided a small point of control and normalcy amid extreme chaos, demonstrating a powerful human coping mechanism.

People justify high-risk strategies by retroactively fitting themselves into a successful subgroup (e.g., 'Yes, most investors fail, but *smart* ones succeed, and I am smart'). This is 'hindsight gerrymandering'—using a trait like 'smartness,' which can only be proven after the fact, to create a biased sample and rationalize the risk.
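A tiny simulation (illustrative only, not from the source) makes the circularity visible: if 'smart' is a label you can only assign after seeing who succeeded, the claim 'smart ones succeed' is true by construction yet says nothing about your odds before you take the risk.

```python
# Minimal sketch (illustrative, not from the source): why a trait that can only
# be verified after the fact cannot justify the risk up front. Success here is
# pure luck; "smart" is assigned retroactively to whoever happened to win.

import random

random.seed(0)
N = 100_000
P_SUCCESS = 0.10                        # high-risk strategy: 90% of attempts fail

outcomes = [random.random() < P_SUCCESS for _ in range(N)]

# Hindsight gerrymandering: label the winners "smart" after seeing the results.
labeled_smart = outcomes                # the label is just the outcome in disguise

# Looking backwards, the claim "smart ones succeed" is 100% true...
success_given_smart = sum(o for o, s in zip(outcomes, labeled_smart) if s) / sum(labeled_smart)
print(f"P(success | labeled 'smart' afterwards) = {success_given_smart:.0%}")

# ...but before the outcome is known, nobody can use the label, so the only
# honest forward-looking estimate is the base rate everyone faces.
print(f"P(success) for any investor deciding today = {sum(outcomes) / N:.0%}")
```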

Distinguish between everyday impulses (often unreliable) and true intuition, which becomes a powerful survival guide during genuine crises. Our hardwired survival mechanisms provide clarity when stakes are highest, a state difficult to replicate in non-crisis situations.

As Nassim Taleb observes, a strategy involving many small losses can appear foolish until a single, massive success. That one event rewrites the entire narrative, validating what was previously seen as delusional. History is rewritten by one good day.
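The shape of that payoff is easy to simulate. A minimal sketch (with made-up numbers, not Taleb's) of a strategy that pays a small premium most days and collects a rare, outsized win:

```python
# Minimal sketch (illustrative numbers, not from Taleb): a strategy that bleeds
# small premiums most days and cashes in on one rare, outsized payoff. For most
# of the run the track record looks like steady failure.

import random

random.seed(1)
DAYS = 1_000
SMALL_LOSS = -1.0        # daily cost of holding the position (e.g., option premium)
BIG_WIN = 1_500.0        # payoff on the rare day the bet hits
P_WIN = 0.001            # roughly one such day per thousand

expected_daily = P_WIN * BIG_WIN + (1 - P_WIN) * SMALL_LOSS
print(f"Expected daily P&L: {expected_daily:+.2f}")   # positive in expectation

pnl = 0.0
worst = 0.0
for day in range(1, DAYS + 1):
    pnl += BIG_WIN if random.random() < P_WIN else SMALL_LOSS
    worst = min(worst, pnl)

print(f"Lowest point of the track record: {worst:.0f}")
print(f"Final P&L after {DAYS} days: {pnl:.0f}")
# Whether the final number is positive hinges on whether the rare day showed up
# at all -- the same strategy reads as 'delusional' or 'visionary' depending on it.
```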

Experienced pilots crashed a perfectly flyable plane because overwhelming alarms caused their executive function to collapse. They fixated on one wrong idea, ignoring contradictory data—a stark warning for investors in volatile markets.

The HMHS Britannic was designed to be safer than the Titanic, but it sank faster. A minor rule violation—leaving portholes open for ventilation against standing orders—allowed water to bypass the watertight compartments after a mine strike, causing a cascading failure that doomed the ship.

The concept of 'prevalence-induced concept change' shows that as significant problems become rarer, we don't perceive fewer problems. Instead, we expand our definition of a 'problem' to include minor inconveniences, making previously neutral situations seem threatening. This explains why comfort can paradoxically increase perceived hardship.