The impulse to "add AI" is common, but workshops exploring it must first ask, "Where do we have good, clean data?" Without a solid data foundation, AI ideation is futile. The first innovation step might be improving data collection, not implementing machine learning.

Related Insights

Many teams wrongly focus on the latest models and frameworks. True improvement comes from classic product development: talking to users, preparing better data, optimizing workflows, and writing better prompts.

For AI products, the quality of the model's response is paramount. Before building a full minimum viable product (MVP) around an AI feature, first validate that you can achieve a "Minimum Viable Output" (MVO). If the core AI output isn't reliable and desirable, don't waste time productizing a feature around it.
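One way to make the MVO idea concrete is a simple quality gate: run the core generation step against a small evaluation set and proceed only if the pass rate clears a threshold. This is a hypothetical sketch, not a prescribed process; `generate` and `is_acceptable` stand in for your actual model call and your human or automated review.

```python
def mvo_gate(cases, generate, is_acceptable, threshold=0.9):
    """Return (passed_gate, pass_rate) for a batch of evaluation cases."""
    passed = sum(1 for case in cases if is_acceptable(generate(case)))
    rate = passed / len(cases)
    return rate >= threshold, rate

# Toy usage with stub functions standing in for a real model and reviewer:
cases = ["invoice summary", "refund email", "meeting recap"]
ok, rate = mvo_gate(
    cases,
    generate=lambda c: c.upper(),          # placeholder for the model call
    is_acceptable=lambda out: out.isupper()  # placeholder for the quality check
)
print(ok, rate)  # True 1.0
```

If the gate fails, the insight above suggests investing in data and prompts rather than building product scaffolding around an unreliable core.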

The impulse to make all historical data "AI-ready" is a trap that can take years and millions of dollars for little immediate return. A more effective approach is to identify key strategic business goals, determine the specific data needed, and focus data preparation efforts there to achieve faster impact and quick wins.

Before implementing AI, organizations must first build a unified data platform. Many companies have multiple, inconsistent "data lakes" and lack basic definitions for concepts like "customer" or "transaction." Without this foundational data consolidation, any attempt to derive insights with AI is doomed to fail due to semantic mismatches.
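The semantic-mismatch problem can be shown in a few lines. In this hypothetical example, two systems answer "how many customers do we have?" differently because each applies its own definition of "customer" to the same accounts.

```python
# Two systems that disagree on what a "customer" is:
# System A counts any registered account; System B counts only
# accounts with at least one completed transaction.
accounts = [
    {"id": 1, "registered": True, "transactions": 3},
    {"id": 2, "registered": True, "transactions": 0},
    {"id": 3, "registered": True, "transactions": 1},
]

customers_a = [a for a in accounts if a["registered"]]        # A's definition
customers_b = [a for a in accounts if a["transactions"] > 0]  # B's definition

print(len(customers_a))  # 3
print(len(customers_b))  # 2
```

Any model trained or queried across both systems inherits this ambiguity, which is why agreeing on shared definitions comes before AI.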

Many leaders mistakenly halt AI adoption while waiting for perfect data governance. This is a strategic error. Organizations should immediately identify and implement the hundreds of high-value generative AI use cases that require no access to proprietary data, creating immediate wins while larger data initiatives continue.

People overestimate AI's out-of-the-box capability. Successful AI products require extensive work on data pipelines, context tuning, and continuous refinement of the model based on evaluation of its outputs. It's not a plug-and-play solution that magically produces correct responses.

Before deploying AI across a business, companies must first harmonize data definitions, especially after mergers. When different business units each define a "raw lead" differently, AI models cannot function reliably. This foundational data work is a critical prerequisite for moving beyond proofs-of-concept to scalable AI solutions.
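In practice, harmonization often starts with an explicit mapping from each unit's labels to one canonical vocabulary, applied before any data reaches a model. This is a minimal sketch under assumed labels; the status names and mapping are illustrative, not a real schema.

```python
# Hypothetical mapping of lead-status labels from two merged units
# onto one canonical vocabulary.
CANONICAL = {
    "raw lead": "lead_raw",
    "unqualified lead": "lead_raw",       # unit B's name for the same stage
    "mql": "lead_qualified",
    "marketing qualified": "lead_qualified",
}

def harmonize(records):
    """Rewrite each record's status to the canonical label, failing loudly on gaps."""
    out = []
    for r in records:
        status = r["status"].strip().lower()
        if status not in CANONICAL:
            raise ValueError(f"unmapped status: {r['status']!r}")
        out.append({**r, "status": CANONICAL[status]})
    return out

merged = harmonize([
    {"source": "unit_a", "status": "Raw Lead"},
    {"source": "unit_b", "status": "Unqualified Lead"},
])
print({r["status"] for r in merged})  # {'lead_raw'}
```

Failing loudly on unmapped labels is deliberate: silent gaps in the vocabulary are exactly what makes downstream models unreliable.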

Before any AI is built, deep workflow discovery is critical. This involves partnering with subject matter experts to map cross-functional processes, data flows, and user needs. AI currently cannot uncover these essential nuances on its own, making this human-centric step non-negotiable for success.

Companies with messy data should focus on generative AI tasks like content creation for immediate value. Predictive AI projects, such as churn forecasting, require extensive data cleaning and expertise, making them slow and complex. Generative tools offer quick efficiency gains with minimal setup, providing a faster path to ROI.

Instead of being swayed by new AI tools, business owners should first analyze their own processes to find inefficiencies. This allows them to select a specific tool that solves a real problem, thereby avoiding added complexity and ensuring a genuine return on investment.