According to information theorist Claude Shannon, "information" is not raw data but surprise: the less predictable a message, the more information it carries. By this definition, a poem is dense with information because of its fresh connections, while a predictable political speech contains almost none.
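Shannon's definition can be made precise: the self-information (surprisal) of an event with probability p is -log2(p) bits. A minimal sketch (the probabilities below are illustrative, not from any real speech or poem):

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon self-information: rarer events carry more bits."""
    return -math.log2(p)

# A highly predictable message (p = 0.99) carries almost no information.
print(surprisal_bits(0.99))  # ~0.014 bits

# A surprising one (p = 0.01) carries far more.
print(surprisal_bits(0.01))  # ~6.64 bits
```

A fair coin flip, at p = 0.5, yields exactly 1 bit, which is the unit the whole theory is built on.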

Related Insights

The "generative" label on AI is misleading. Its true power for daily knowledge work lies not in creating artifacts, but in its superhuman ability to read, comprehend, and synthesize vast amounts of information—a far more frequent and fundamental task than writing.

Economics can be viewed as the physics of information, where profit is the surplus created when intelligent agents organize chaos into useful order (reducing entropy) faster than the system naturally decays back into disorder.

The most effective ideas are not the most outlandish. Human psychology craves both novelty and familiarity simultaneously. Truly successful creative work, from marketing to scientific research, finds the perfect balance between being innovative and being grounded in something the audience already understands.

In the age of AI, the new standard for value is the "GPT Test." If a person's public statements, writing, or ideas could have been generated by a large language model, they will fail to stand out. This places an immense premium on true originality, deep insight, and an authentic voice—the very things AI struggles to replicate.

The pursuit of pure originality is often a status game that leads to incomprehensible ideas. A more effective approach is to see originality as a new way to show people an old, constant truth. This reframes innovation as a novel form of derivation, making it more accessible and relatable.

Explaining a creative process is inherently a "lossy compression" of the real thing; key nuances are lost in translation. This is why even detailed explanations of a successful process can't be perfectly replicated. The audience then "uncompresses" this partial data into their own interpretation.

The critique that LLMs lack true creativity because they merely recombine and predict from existing data is undercut by the observation that human creativity, particularly in branding and marketing, often works the same way: it combines existing concepts in novel ways so they feel fresh, much like an LLM does.

Morgan Housel finds that the content that performs best is often basic and seems obvious to the writer. Readers resonate with ideas they already intuitively feel but have never seen articulated. This connection requires less mental bandwidth than processing a completely novel concept, leading to wider sharing.

The human mind rejects ideas that are too novel. Effective communication and innovation should be grounded in the familiar, introducing only about 20% new information. This principle comes from industrial designer Raymond Loewy's "Most Advanced Yet Acceptable" (MAYA) rule, which helps make new concepts intelligible and acceptable.

Google's Titans architecture for LLMs mimics human memory by applying Claude Shannon's information theory. It scans vast data streams and identifies "surprise"—statistically unexpected or rare information relative to its training data. This novel data is then prioritized for long-term memory, preventing clutter from irrelevant information.
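The gating idea can be sketched with a toy frequency model: store an item in long-term memory only when its surprisal exceeds a threshold. This is an illustration of the principle, not the actual Titans mechanism (which measures surprise via gradients on a learned memory, not token counts); the class name and threshold here are invented for the example.

```python
import math
from collections import Counter

class SurpriseGatedMemory:
    """Toy sketch: remember only what is statistically surprising."""

    def __init__(self, threshold_bits: float = 3.0):
        self.threshold = threshold_bits  # surprisal needed to be stored
        self.counts = Counter()          # simple frequency model
        self.total = 0
        self.memory = []                 # "long-term memory"

    def observe(self, token: str) -> bool:
        # Laplace-smoothed probability of the token under the counts so far.
        p = (self.counts[token] + 1) / (self.total + len(self.counts) + 1)
        surprisal = -math.log2(p)
        self.counts[token] += 1
        self.total += 1
        if surprisal > self.threshold:
            self.memory.append(token)    # surprising -> keep it
            return True
        return False                     # expected -> let it pass through

mem = SurpriseGatedMemory(threshold_bits=3.0)
for _ in range(20):
    mem.observe("the")       # routine tokens never clear the bar
stored = mem.observe("quark")  # a rare token does
print(stored, mem.memory)      # True ['quark']
```

The filter is the point: frequent, expected input never reaches long-term memory, so it cannot clutter it, while rare input is prioritized automatically.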