We scan new podcasts and send you the top 5 insights daily.
Coined by novelist Michael Crichton, "Gell-Mann Amnesia" describes how we spot glaring errors in news coverage of our own profession, then flip the page and accept reporting on every other subject as credible. This compartmentalized skepticism helps explain why trust in flawed institutions persists.
The ability to label a deepfake as 'fake' doesn't solve the problem. The greater danger is 'frequency bias,' where repeated exposure to a false message forms a strong mental association, making the idea stick even when it's consciously rejected as untrue.
We live in "communities of knowledge" where expertise is distributed. Simply being part of a group where others understand a topic (e.g., politics, technology) creates an inflated sense that we personally understand it, contributing to the illusion of individual knowledge.
Pundits who were correct about past tech bubbles (like crypto) are now making confidently wrong predictions about AI. Because expertise doesn't transfer cleanly between domains, a strong track record in one area can lend unwarranted credibility in another — a twist on the Gell-Mann Amnesia effect that should push readers to question sources opining outside their core expertise.
Our cognitive wiring prefers harmless errors (false positives, e.g., seeing a predator that isn't there) over fatal ones (false negatives, e.g., missing a predator that is). This "better safe than sorry" principle, as described by Michael Shermer, underlies our susceptibility to misinformation and snap judgments.
We don't form beliefs from neutral evidence. Instead, our existing identity acts as a filter on how we interpret ambiguous events, manufacturing new "evidence" that reinforces whatever we already believe.
Propaganda is effective because it leverages a cognitive bias called the "availability heuristic." By repeating a phrase like "weapons of mass destruction," it becomes the most easily recalled information, causing people—even highly educated ones—to subconsciously accept it as true, regardless of countervailing evidence.
We are cognitively wired with a "truth bias," causing us to automatically assume that what we see and hear is true. We only engage in skeptical checking later, if at all. Scammers exploit this default state, ensnaring us before our slower, more deliberate thinking can kick in.
When emotionally invested, even seasoned professionals can ignore their own expertise. The speaker, a researcher, sought validation from biased sources such as friends instead of conducting objective market research, showing that personal attachment can override professional discipline.
While it's wise to question experts' motives, that message has been over-emphasized, breeding a counterproductive cynicism in which people distrust all experts and data, dismiss everything as "fake news," and rely on gut feelings instead of evidence.
A key reason biases persist is the 'bias blind spot': the tendency to recognize cognitive errors in others while failing to see them in ourselves. This overconfidence prevents individuals from adopting helpful decision-making tools or choice architecture, as they instinctively believe 'that's them, not me.'