We scan new podcasts and send you the top 5 insights daily.
The airline industry's practice of sharing "black box" data and granting pilots immunity fosters a culture of learning from mistakes. Corporations can adopt this to encourage transparency and prevent a "blame game" culture when things go wrong.
The airline industry's safety record improved because 'black box' data from crashes must be shared so that all airlines can learn from it. Businesses can adapt this by systematically sharing lessons from failures across the entire organization rather than letting them stay siloed.
To accelerate organizational learning in AI, incentivize the sharing of failures. A Fortune 500 company gives employees redeemable points for sharing use cases, but offers *extra points* for detailing a failed experiment and the resulting lesson. This normalizes failure and prevents others from repeating the same mistakes.
To prevent a culture of blame, Sierra holds public "lessons learned" sessions for any failure, from lost deals to bugs. This frames failure as the collective responsibility of the team, not an individual's fault. The focus is on fixing the underlying system, keeping scrutiny on processes, not people.
When a manager reacts to an error by asking for solutions instead of assigning blame, it signals that mistakes are survivable. This psychological safety encourages employees to be truthful and report issues immediately, allowing the organization to solve problems faster and more effectively.
AstroForge's CEO Matt Gialich champions radical transparency, especially after setbacks. When their Odin mission failed, the company published detailed articles explaining exactly what went wrong and how they planned to fix it. This approach builds trust with stakeholders and institutionalizes learning from mistakes.
Instead of stigmatizing failure, LEGO embeds a formal "After Action Review" (AAR) process into its culture, with reviews happening daily at some level. This structured debrief forces teams to analyze why a project failed and apply those specific learnings across the organization to prevent repeat mistakes.
Instead of blaming an individual for a failed initiative, ask what in the process could be improved. This shift fosters psychological safety and encourages team members to take creative risks without worrying about personal reprisal.
The team conducts immediate "hot debriefs" for rapid learning, within a thick-skinned culture focused on improvement, not blame. "Cold debriefs" happen later, once emotions from the high-pressure event have cooled and a more strategic conversation is possible.
Menlo's culture operates on the principle that when mistakes happen, the system is at fault, not the individual. This approach removes fear and blame, encouraging the team to analyze and improve the processes that allowed the error to occur, fostering a culture of continuous improvement.
If an employee makes an error while following your instructions, the instructions are flawed, not the employee. This approach shifts the focus from penalizing individuals to improving systems. It creates a better training process and a psychologically safe culture that values feedback.