The airline industry's safety record improved because 'black box' data from crashes must, by law, be investigated and the findings shared so that every airline can learn from them. Businesses can adopt this by creating a culture where lessons from failures are systematically shared across the entire organization, not siloed.
When a billion-dollar drug trial fails, society learns nothing from the operational process. The detailed documentation of regulatory interactions, manufacturing, and trial design—the "lab notes" of clinical development—is locked away as a trade secret and effectively destroyed, preventing collective industry learning.
The most valuable lessons in clinical trial design come from understanding what went wrong. By analyzing the protocols of failed studies, researchers can identify hidden biases, flawed methodologies, and uncontrolled variables, learning precisely what to avoid in their own work.
Much like a failed surgery provides crucial data for a future successful one, business failures should be seen as necessary steps toward a breakthrough. A "scar" from a failed project is evidence of progress and learning, not something to be hidden. This mindset is foundational for psychological safety.
Traditional risk registers are theater: documents written to be filed, not used. Use a 'Learning Board' with three columns: 'Assumption,' 'Test,' and 'What We Learned.' This reframes risk management as a continuous discovery process and serves as a transparent communication tool for stakeholders, replacing bureaucratic documentation.
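The three-column Learning Board can be made concrete in code. The sketch below is a minimal, hypothetical model (the class and field names are illustrative, not from any named tool): each row holds an assumption and its test, and the 'What We Learned' column stays empty until the test has actually run, making open risks easy to surface.

```python
from dataclasses import dataclass, field


@dataclass
class LearningEntry:
    """One row on the board: an assumption, how it is tested, and the result."""
    assumption: str
    test: str
    learned: str = ""  # filled in only after the test has produced a result


@dataclass
class LearningBoard:
    entries: list = field(default_factory=list)

    def add(self, assumption: str, test: str) -> LearningEntry:
        entry = LearningEntry(assumption, test)
        self.entries.append(entry)
        return entry

    def open_tests(self) -> list:
        """Rows whose tests have not yet yielded a learning."""
        return [e for e in self.entries if not e.learned]


board = LearningBoard()
row = board.add(
    assumption="Customers will pay for the premium tier",
    test="Offer the upgrade to 100 trial users for two weeks",
)
print(len(board.open_tests()))  # 1: the assumption is still untested
row.learned = "Only 3% upgraded; the price point is too high"
print(len(board.open_tests()))  # 0: risk converted into a recorded learning
```

Keeping 'What We Learned' as the completion criterion is the design point: a row is never "closed," only converted into knowledge.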
Instead of stigmatizing failure, LEGO embeds a formal "After Action Review" (AAR) process into its culture, with reviews happening daily at some level. This structured debrief forces teams to analyze why a project failed and apply those specific learnings across the organization to prevent repeat mistakes.
Intuition is not a mystical gut feeling but rapid pattern recognition based on experience. Since leaders cannot "watch game tape," they must build this mental library by systematically discussing failures and setbacks. This process of embedding learnings sharpens their ability to recognize patterns in future situations.
To avoid repeating errors during rapid growth, HubSpot used a 'Pothole Report.' This process involved a post-mortem on every significant mistake, asking how it could have been handled or what data was needed a year ago to prevent it, effectively institutionalizing learning from failure and promoting proactive thinking.
A sophisticated learning culture avoids the generic 'fail fast' mantra by distinguishing four mistake types. 'Stretch' mistakes are good and occur when pushing limits. 'High-stakes' mistakes are bad and must be avoided. 'Sloppy' mistakes happen on tasks already mastered and signal lapses in focus or process. 'Aha-moment' mistakes provide deep insights. This framework allows for a nuanced, situation-appropriate response to error.
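The four-type taxonomy can be encoded so that a team's response is looked up rather than improvised. This is an illustrative sketch: the enum names mirror the four types above, and the mapped responses are hypothetical examples of "situation-appropriate" reactions, not a prescribed playbook.

```python
from enum import Enum, auto


class MistakeType(Enum):
    STRETCH = auto()      # made while pushing beyond current ability
    HIGH_STAKES = auto()  # unacceptable; must be prevented up front
    SLOPPY = auto()       # a slip on something already mastered
    AHA_MOMENT = auto()   # only visible in hindsight; yields new insight


# Hypothetical mapping from mistake type to an appropriate team response.
RESPONSE = {
    MistakeType.STRETCH: "celebrate the attempt and keep experimenting",
    MistakeType.HIGH_STAKES: "add safeguards so it cannot recur",
    MistakeType.SLOPPY: "tighten the checklist or focus that lapsed",
    MistakeType.AHA_MOMENT: "document the insight and share it widely",
}


def respond(mistake: MistakeType) -> str:
    """Return the situation-appropriate response for a classified mistake."""
    return RESPONSE[mistake]
```

The value of the lookup is that classification happens first and reaction second, which is exactly what the blanket 'fail fast' mantra skips.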
Menlo's culture operates on the principle that when mistakes happen, the system is at fault, not the individual. This approach removes fear and blame, encouraging the team to analyze and improve the processes that allowed the error to occur, fostering a culture of continuous improvement.
Treat accountability as an engineering problem. Implement a system that logs every significant AI action, decision path, and triggering input. This creates an auditable, attributable record, ensuring that in the event of an incident, the 'why' can be traced without ambiguity, much like a flight recorder after a crash.
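A minimal version of such a flight recorder can be sketched in a few lines. The function and file names below are hypothetical; the sketch assumes append-only JSON Lines as the storage format and hashes the triggering input so the record is attributable without retaining sensitive raw data.

```python
import hashlib
import json
import time
from pathlib import Path

AUDIT_LOG = Path("ai_audit.jsonl")  # hypothetical append-only log location


def record_action(action: str, decision_path: list, triggering_input: str) -> dict:
    """Append one auditable record per significant AI action.

    The decision path captures which rules or model steps fired, so the
    'why' behind an incident can be traced without ambiguity later.
    """
    entry = {
        "ts": time.time(),
        "action": action,
        "decision_path": decision_path,
        "input_sha256": hashlib.sha256(triggering_input.encode()).hexdigest(),
    }
    with AUDIT_LOG.open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry


# Example: log a loan-denial decision along with the rules that fired.
record_action(
    action="deny_loan",
    decision_path=["income_check:fail", "credit_score:620<650"],
    triggering_input="applicant_id=12345",
)
```

Append-only JSON Lines keeps each record independent and tamper-evident to inspect, which is what makes the log usable as a flight recorder after an incident rather than just a debug trace.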