Our cognitive wiring prefers making harmless errors (false positives, e.g., seeing a predator that isn't there) over fatal ones (false negatives, e.g., missing a predator that is). This "better safe than sorry" principle, as described by Michael Shermer, underlies our susceptibility to misinformation and snap judgments.
Mathematical models of evolution demonstrate a near-zero probability that natural selection would shape sensory systems to perceive objective truth. Instead, our senses evolved merely to guide adaptive behavior, prioritizing actions that lead to survival and reproduction over generating an accurate depiction of the world.
During the COVID pandemic, some people drank bleach because our brains are wired to despise uncertainty. In the absence of clear answers, we gravitate towards any promised solution, however dangerous, because taking action provides a false sense of control.
Michael Shermer suggests that when people latch onto misinformation, it's less about the event's specifics and more a manifestation of a pre-existing tribal belief. The false story simply reinforces a general sentiment, like "I don't trust that group," making the specific facts irrelevant.
Most anxiety feels disproportionate because evolution prioritizes survival. The cost of missing a real threat (a "false negative") is catastrophic (death), while the cost of a "false positive" (needless anxiety) is merely some wasted calories. This makes excessive worry a rational, albeit painful, trade-off known as the "smoke alarm principle."
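The asymmetry can be made concrete with a toy expected-cost calculation. The numbers below are illustrative assumptions, not from the episode; the only thing that matters is that one error is far costlier than the other:

```python
# A toy expected-cost model of the "smoke alarm principle".
# Assumed costs (illustrative, not from the source): missing a real
# threat (false negative) is vastly costlier than a needless alarm
# (false positive).

COST_FALSE_NEGATIVE = 10_000  # catastrophic: failing to react to a real threat
COST_FALSE_POSITIVE = 1       # cheap: brief anxiety, a few wasted calories

def expected_cost(alarm: bool, p_threat: float) -> float:
    """Expected cost of a fixed policy, given the probability of a real threat."""
    if alarm:
        # We pay the false-positive cost whenever there was no threat.
        return (1 - p_threat) * COST_FALSE_POSITIVE
    # We pay the false-negative cost whenever there was a threat.
    return p_threat * COST_FALSE_NEGATIVE

# Even when a threat is very unlikely, alarming minimizes expected cost:
p = 0.001
print(expected_cost(True, p))   # 0.999
print(expected_cost(False, p))  # 10.0
```

With these numbers, alarming beats staying calm whenever `p * COST_FALSE_NEGATIVE > (1 - p) * COST_FALSE_POSITIVE`, i.e., for any threat probability above roughly 0.0001. That is why a well-tuned alarm still fires constantly: most of its alarms are false, and it is rational anyway.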
Propaganda is effective because it leverages a cognitive bias called the "availability heuristic." By repeating a phrase like "weapons of mass destruction," it becomes the most easily recalled information, causing people—even highly educated ones—to subconsciously accept it as true, regardless of countervailing evidence.
Michael Shermer highlights that reason isn't purely for objective truth-seeking. It also evolved to help us persuade others and defend our group's beliefs. Often, our minds act more like lawyers defending a client (our beliefs) than scientists searching for objective reality.
We are cognitively wired with a "truth bias," causing us to automatically assume that what we see and hear is true. We only engage in skeptical checking later, if at all. Scammers exploit this default state, ensnaring us before our slower, more deliberate thinking can kick in.
The human brain is not optimized for changing its mind based on new data, but for winning arguments. This evolutionary trait traps people in their existing frames of reference, preventing them from assessing reality objectively and finding effective solutions.
Humans are biased to overestimate downside and underestimate upside because our ancestors' survival depended on it. The cautious survived and passed on their pessimistic genes. In the modern world, where most risks are not fatal, this bias keeps us from pursuing opportunities whose true upside lies in the unknown.
Human brains are optimized to interpret social patterns, which was critical for survival. This social focus makes us inherently poor at perceiving objective physical reality directly. Individuals less sensitive to social cues might possess a cognitive architecture better suited for scientific inquiry.