According to Salesforce's AI chief, the primary challenge for large companies deploying AI is harmonizing data across siloed departments, like sales and marketing. AI cannot operate effectively without connected, unified data, making data integration the crucial first step before any advanced AI implementation.

Related Insights

Despite promises of a single source of truth, modern data platforms like Snowflake are often deployed for specific departments (e.g., marketing, finance), creating larger, more entrenched silos. This decentralization paradox persists because different business functions like analytics and operations require purpose-built data repositories, preventing true enterprise-wide consolidation.

AI's most significant impact is not just campaign optimization but its ability to break down data silos. By combining loyalty, e-commerce, and in-store interaction data, retailers can create a holistic customer view, enabling truly adaptive and intelligent marketing across all channels.
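To make the idea of combining loyalty, e-commerce, and in-store data concrete, here is a minimal sketch of joining exports from those three systems into one customer view. It assumes each system can export rows keyed on a shared customer_id; the column names and the pandas-based approach are illustrative, not a description of any vendor's actual pipeline.

```python
import pandas as pd

# Hypothetical exports from three siloed systems, keyed on a shared customer_id.
loyalty = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "loyalty_tier": ["gold", "silver", "gold"],
    "points": [4200, 1800, 5100],
})
ecommerce = pd.DataFrame({
    "customer_id": [101, 103, 104],
    "online_orders_90d": [3, 1, 2],
    "last_online_order": ["2024-05-02", "2024-04-18", "2024-05-10"],
})
in_store = pd.DataFrame({
    "customer_id": [102, 103],
    "store_visits_90d": [5, 2],
})

# Outer-join the three sources so no customer is dropped just because
# they appear in only one system.
profile = (
    loyalty.merge(ecommerce, on="customer_id", how="outer")
           .merge(in_store, on="customer_id", how="outer")
)

# Fill gaps with neutral defaults so downstream models see a complete row.
profile[["online_orders_90d", "store_visits_90d"]] = (
    profile[["online_orders_90d", "store_visits_90d"]].fillna(0)
)

print(profile)
```

The outer joins are the point: an inner join would silently discard the in-store-only or online-only customers, which is exactly the partial view the paragraph above warns against.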

Fragmented data and disconnected systems in traditional marketing clouds prevent AI from forming a complete, persistent memory of customer interactions. This leads to missed opportunities and flawed personalization, as the AI operates with incomplete information, exposing foundational cracks in legacy architecture.

The conventional wisdom that enterprises are blocked by a lack of clean, accessible data is wrong. The true bottleneck is people and change management. Scrappy teams can derive significant value from existing, imperfect internal and public data; the real challenge is organizational inertia and process redesign.

Marketing leaders pressured to adopt AI are discovering the primary obstacle isn't the technology, but their own internal data infrastructure. Siloed, inconsistently structured data across teams prevents them from effectively leveraging AI for consumer insights and business growth.

Many leaders mistakenly halt AI adoption while waiting for perfect data governance. This is a strategic error. Organizations should instead identify and implement the hundreds of high-value generative AI use cases that require no access to proprietary data, creating immediate wins while larger data initiatives continue.

The core problem for many small and mid-market businesses isn't a lack of software but an excess of it: a typical business runs 7 to 25 different apps, creating massive data fragmentation. The crucial first step isn't buying more tools, but unifying existing data into a single customer profile to enable smarter, automated marketing.
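A single customer profile usually comes down to identity resolution across those apps. The sketch below stitches a few hypothetical app exports together on a normalized email address; the app names, fields, and matching rule are assumptions for illustration, and real-world stitching typically needs fuzzier matching and more signals than an exact email key.

```python
from collections import defaultdict

# Hypothetical exports from three of the many apps a small business might run.
crm_contacts = [{"email": "Ana@Example.com", "name": "Ana Ruiz", "source": "crm"}]
email_tool   = [{"email": "ana@example.com", "opens_30d": 12, "source": "email"}]
storefront   = [{"email": "ANA@example.com ", "orders": 4, "source": "shop"}]

def normalize(email: str) -> str:
    """Use a lowercased, trimmed email as the stitching key."""
    return email.strip().lower()

profiles = defaultdict(dict)
for record in crm_contacts + email_tool + storefront:
    key = normalize(record["email"])
    profile = profiles[key]
    profile["email"] = key
    # First-seen value wins; each app contributes the fields it knows about.
    for field, value in record.items():
        if field not in ("email", "source"):
            profile.setdefault(field, value)
    profile.setdefault("sources", []).append(record["source"])

print(profiles["ana@example.com"])
```

The result is one record per customer that lists which systems contributed to it, which is the minimum an automated marketing tool needs before it can act coherently across channels.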

For years, access to compute was the primary bottleneck in AI development. Now, as public web data is largely exhausted, the limiting factor is access to high-quality, proprietary data from enterprises and human experts. This shifts the focus from building massive infrastructure to forming data partnerships and tapping human expertise.

The primary reason multi-million dollar AI initiatives stall or fail is not the sophistication of the models, but the underlying data layer. Traditional data infrastructure creates delays in moving and duplicating information, preventing the real-time, comprehensive data access required for AI to deliver business value. The focus on algorithms misses this foundational roadblock.

To maximize AI's impact, don't just find isolated use cases for content or demand gen teams. Instead, map a core process like a campaign workflow and apply AI to augment each stage, from strategy and creation to localization and measurement. AI is workflow-native, not function-native.
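One way to read "workflow-native" is to model the campaign as an ordered pipeline and attach an AI assist to every stage rather than to a single team's task. The sketch below is purely illustrative: the stage names follow the paragraph above, and the assist() placeholder stands in for whatever model or tool a team actually calls at each step.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Stage:
    name: str
    ai_assist: Callable[[dict], dict]  # augments the stage's working context

def assist(task: str) -> Callable[[dict], dict]:
    """Placeholder for a real model call; here it just annotates the context."""
    def run(context: dict) -> dict:
        context.setdefault("ai_notes", []).append(f"AI support for {task}")
        return context
    return run

# The whole campaign workflow gets AI augmentation, not one isolated function.
campaign_workflow = [
    Stage("strategy",     assist("audience and positioning analysis")),
    Stage("creation",     assist("draft copy and creative variants")),
    Stage("localization", assist("translate and adapt per market")),
    Stage("measurement",  assist("summarize performance and next tests")),
]

context = {"campaign": "spring-launch"}
for stage in campaign_workflow:
    context = stage.ai_assist(context)

print(context["ai_notes"])
```

Because the context object flows through every stage, what the AI learns during strategy is still available at measurement, which is the practical difference between workflow-native augmentation and a collection of disconnected point solutions.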