Kevin Rose describes discovering he has aphantasia, a condition where one cannot voluntarily visualize mental images. For these individuals, abstract concepts and memories are experienced through feelings and kinesthetics rather than vivid pictures, highlighting vast, often unknown, differences in human cognition.
Our sense that we first perceive and then react is an illusion. The brain constantly predicts the next moment from past experience, preparing actions before sensory information fully arrives. Because predicting is far more efficient than reacting to the world from scratch, we effectively act first and sense second.
Trauma is not an objective property of an event but a subjective experience created by the relationship between a present situation and past memories. Because experience is a combination of sensory input and remembered past, changing the meaning or narrative of past events can change the experience of trauma itself.
Memory doesn't work like a linear filing system. It's stored in associative patterns based on themes and emotions. When one memory is activated, it can trigger a cascade of thematically connected memories, regardless of when they occurred, explaining why a current event can surface multiple similar past experiences.
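As a loose analogy (a toy sketch, not a neuroscience model; all memories and themes below are invented for illustration), this theme-indexed retrieval resembles an inverted index: memories are keyed by the themes they share, so activating one theme surfaces every matching memory regardless of chronology.

```python
from collections import defaultdict

# Toy associative store: each memory is tagged with themes/emotions,
# not filed by date.
memories = {
    "missed the train (2021)": {"frustration", "lateness"},
    "failed exam (2009)": {"frustration", "inadequacy"},
    "forgot a birthday (2015)": {"guilt", "lateness"},
    "won an award (2018)": {"pride"},
}

# Invert the store: theme -> set of memories carrying that theme.
by_theme = defaultdict(set)
for memory, themes in memories.items():
    for theme in themes:
        by_theme[theme].add(memory)

def recall(cue_themes):
    """Activating cue themes surfaces every thematically linked
    memory at once, ignoring when each event occurred."""
    activated = set()
    for theme in cue_themes:
        activated |= by_theme[theme]
    return activated

# A present event tagged 'lateness' pulls up 2021 and 2015 together.
print(sorted(recall({"lateness"})))
```

A date-ordered log would need a full scan to find these; the theme index makes the cascade immediate, which is the point of the analogy.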
Contrary to popular belief, intuition isn't just a vague "gut feeling" or a pattern confined to the brain. Trauma research, popularized by Bessel van der Kolk's "The Body Keeps the Score," suggests that life patterns and accumulated experience are physically embedded in the body's fascia and musculature.
Scientific literature suggests humans have between 22 and 33+ physiological senses, including balance, proprioception, and awareness of internal states like bladder fullness. This reframes human potential, suggesting we are capable of perceiving far more than we commonly acknowledge.
Designing for users with motor disabilities who control interfaces with their minds presents a unique challenge. Unlike typical design scenarios, it's impossible for designers to truly imagine or simulate the sensory experience, making direct empathy an unreliable tool for closed-loop interactions.
Vision, a product of 540 million years of evolution, is a highly complex process. However, because it's an innate, effortless ability for humans, we undervalue its difficulty compared to language, which requires conscious effort to learn. This bias impacts how we approach building AI systems.
Labels like 'imposter syndrome' or 'feeling like a failure' are purely mental stories, not physical realities. Your body doesn't know what 'failure' is; it only experiences sensations like a churning stomach or tightness in the chest. By focusing on the raw physical feeling, you disconnect from the mind's debilitating narrative.
People watched the movie 'Contagion' during the pandemic rather than reading scientific papers because the human brain is wired to learn through first-person stories, not lists of facts. Narratives provide a simulated, experiential perspective that taps into ancient brain mechanisms, making the information more memorable, understandable, and emotionally resonant.
Our brains process natural scenes with high 'fluency,' compressing a complex view like a tree with thousands of leaves into a single, simple concept. In contrast, urban scenes often require us to mentally catalog distinct objects (cars, signs, buildings), creating a higher cognitive load and contributing to mental fatigue.