The FAA runs a program under which pilots can self-report safety issues without fear of enforcement action, provided no criminal conduct is involved. This encourages transparency and vital data collection, making the entire industry safer by systematically learning from individual mistakes and near-misses.
The airline industry's safety record improved in part because crash investigation findings, including 'black box' flight-recorder data, are published for every airline to learn from. Businesses can adopt this by creating a culture where learnings from failures are systematically shared across the entire organization, not siloed.
To accelerate organizational learning in AI, incentivize the sharing of failures. A Fortune 500 company gives employees redeemable points for sharing use cases, but offers *extra points* for detailing a failed experiment and the resulting lesson. This normalizes failure and prevents others from repeating the same mistakes.
Companies can surface honest feedback on major projects by creating anonymous, internal prediction markets. This allows employees to share crucial 'inside information' about potential delays or failures without fear of reprisal from leadership that only wants to hear good news.
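For teams curious about the mechanics, one standard way to run such a market is Hanson's logarithmic market scoring rule (LMSR), where the market maker's prices double as crowd-sourced probability estimates. Below is a minimal Python sketch; the `LMSRMarket` class, the example question, and the liquidity value are illustrative assumptions, not a production design.

```python
import math

class LMSRMarket:
    """Minimal LMSR market maker for one binary question, e.g.
    'Will Project X ship by Q3?'. Hypothetical sketch: employees trade
    anonymously, and the YES price is a crowd probability estimate that
    leadership can read without knowing who holds which position."""

    def __init__(self, liquidity: float = 100.0):
        self.b = liquidity                     # higher b = prices move more slowly
        self.shares = {"YES": 0.0, "NO": 0.0}  # net shares sold per outcome

    def _cost(self) -> float:
        # LMSR cost function: C(q) = b * ln(sum_i exp(q_i / b))
        return self.b * math.log(sum(math.exp(q / self.b) for q in self.shares.values()))

    def price(self, outcome: str) -> float:
        # Instantaneous price = implied probability of that outcome
        total = sum(math.exp(q / self.b) for q in self.shares.values())
        return math.exp(self.shares[outcome] / self.b) / total

    def buy(self, outcome: str, amount: float) -> float:
        # A trade costs the change in the cost function
        before = self._cost()
        self.shares[outcome] += amount
        return self._cost() - before

market = LMSRMarket()
cost = market.buy("NO", 40)  # an engineer who doubts the deadline buys NO
print(f"paid {cost:.2f}; implied P(ship by Q3) is now {market.price('YES'):.2f}")
```

The liquidity parameter `b` controls how far a single trade can move the price; the anonymity has to come from the surrounding system (pseudonymous accounts), not from the math itself.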
When a manager reacts to an error by asking for solutions instead of assigning blame, it signals that mistakes are survivable. This psychological safety encourages employees to be truthful and report issues immediately, allowing the organization to solve problems faster and more effectively.
A 'blame and shame' culture develops when all bad outcomes are punished equally, chilling employee reporting. To foster psychological safety, leaders must distinguish between unintentional mistakes (errors) and conscious violations (choices). A just response to each builds a culture where people feel safe admitting failures.
The airline industry's practice of sharing "black box" data and granting pilots immunity fosters a culture of learning from mistakes. Corporations can adopt this to encourage transparency and prevent a "blame game" culture when things go wrong.
Instead of blaming individuals for errors, leaders should analyze the systemic conditions that led to the mistake. Error isn't random; it's a patterned outcome. This shifts the focus from 'fixing people' to designing more resilient systems.
AstroForge's CEO Matt Gialich champions radical transparency, especially after setbacks. When their Odin mission failed, the company published detailed articles explaining exactly what went wrong and how they planned to fix it. This approach builds trust with stakeholders and institutionalizes learning from mistakes.
Instead of relying on passive whistleblower hotlines, companies can proactively identify high-risk areas. A simple survey asking employees whether they've seen misconduct, whether they reported it, and if not, why, acts as a powerful diagnostic tool to pinpoint where integrity gaps are emerging before they become major crises.
Treat accountability as an engineering problem. Implement a system that logs every significant AI action, decision path, and triggering input. This creates an auditable, attributable record, ensuring that in the event of an incident, the 'why' can be traced without ambiguity, much like a flight recorder after a crash.
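As a concrete illustration, here is a minimal Python sketch of such a 'flight recorder': an append-only log where each entry captures the triggering input, the decision path, and the action taken, and is hash-chained to the previous entry so after-the-fact tampering is detectable. The `FlightRecorder` class and its field layout are hypothetical, not a reference to any particular product.

```python
import hashlib
import json
import time
import uuid

class FlightRecorder:
    """Hypothetical append-only audit log for AI actions.
    Each record links to the previous one by SHA-256 hash,
    so any retroactive edit breaks the chain."""

    def __init__(self, path: str = "ai_audit.log"):
        self.path = path
        self.prev_hash = "0" * 64  # genesis hash for the first entry

    def record(self, actor: str, trigger: dict, decision_path: list, action: dict) -> str:
        entry = {
            "id": str(uuid.uuid4()),
            "ts": time.time(),
            "actor": actor,                   # which model/agent acted
            "trigger": trigger,               # the input that started it
            "decision_path": decision_path,   # e.g. rules hit, tools called
            "action": action,                 # what was actually done
            "prev_hash": self.prev_hash,      # chain to the previous entry
        }
        digest = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = digest
        with open(self.path, "a") as f:
            f.write(json.dumps(entry) + "\n")
        self.prev_hash = digest
        return digest

recorder = FlightRecorder()
recorder.record(
    actor="pricing-model-v3",
    trigger={"sku": "A-1042", "competitor_price": 18.99},
    decision_path=["rule:min_margin_ok", "model:discount_band_2"],
    action={"set_price": 17.49},
)
```

In practice such records would go to write-once storage; the hash chain only makes tampering evident, it doesn't prevent it.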