While it can feel frustrating, mandating that teams use AI tools daily is a "necessary evil." This aggressive approach forces rapid adoption and internal learning, allowing a company to disrupt itself before competitors do. The speed of AI's impact makes this an uncomfortable but critical survival strategy.

Related Insights

Mandating AI usage can backfire by making the tools feel like a threat. A better approach is to create "safe spaces" for exploration. Atlassian runs "AI builders weeks," blocking off synchronous time for cross-functional teams to tinker together. The celebrated outcome is learning, not a finished product, which removes pressure and encourages genuine experimentation.

To drive adoption, Axios's CEO gave all staff licensed AI access and a simple mandate: spend 10% of your day finding ways it can improve your specific job and share wins. This bottom-up, experimental approach fostered organic adoption and practical use cases more effectively than a top-down directive.

An effective AI strategy pairs a central task force for enablement—handling approvals, compliance, and awareness—with empowerment of frontline staff. The best, most elegant applications of AI will be identified by those doing the day-to-day work.

AI is a "hands-on revolution," not a technological shift like the cloud that can be delegated to an IT department. To lead effectively, executives (including non-technical ones) must personally use AI tools. This direct experience is essential for understanding AI's potential and guiding teams through transformation.

The biggest resistance to adopting AI coding tools in large companies isn't security or technical limitations, but the challenge of teaching teams new workflows. Success requires not just providing the tool, but actively training people to change their daily habits to leverage it effectively.

Simply buying an AI tool is insufficient for understanding its potential or deriving value. Leaders feeling behind in AI must actively participate in the deployment process—training the model, handling errors, and iterating daily. Passive ownership and delegation yield zero learning.

Moving beyond casual experimentation with AI requires a cultural mandate for frequent, deep integration. Employees should engage with generative AI tools multiple times every hour to ideate, create, or validate work, treating it as an ever-present collaborator rather than an occasional tool.

Recognizing that providing tools is insufficient, LinkedIn is making "AI agency and fluency" a core part of its performance evaluation and calibration process. This formalizes the expectation that employees must actively use AI tools to succeed, moving adoption from voluntary to a career necessity.

A successful AI transformation isn't just about providing tools. It requires a dual approach: senior leadership must clearly communicate that AI adoption is a strategic priority while also giving individual employees the autonomy to innovate and transform their own workflows.

To lead in the age of AI, it's not enough to use new tools; you must intentionally disrupt your own effective habits. Force yourself to build, write, and communicate in new ways to truly understand the paradigm shift, even when your old methods still work well.
