According to Dropbox's VP of Engineering, the flood of low-quality, AI-generated "work slop" isn't a technology problem, but a strategy problem. When leaders push for AI adoption without defining crisp use cases and goals, employees are left to generate generic content that fails to add real value.
While AI tools once gave creators an edge, that edge has been democratized, and AI-assisted output now reads as undifferentiated. IBM's AI VP, who grew a following of 200k, now uses AI less: the new edge is spending more time on unique human thinking and reserving AI for initial ideation, not final writing.
To avoid "AI slop"—the proliferation of low-quality AI outputs—Dell's CTO advocates for a disciplined, top-down strategy. Instead of letting tools run wild, they focus on a small number of high-impact use cases with clear business outcomes, ensuring quality and preventing chaos.
The term "slop" is often treated as AI-specific, but it actually describes any generic, undifferentiated output designed for mass appeal, a problem that existed in human-made media long before LLMs. AI is simply a new tool for scaling its creation.
Without a strong foundation in customer problem definition, AI tools simply accelerate bad practices. Teams that habitually jump to solutions without a clear "why" will find themselves building rudderless products at an even faster pace. AI makes foundational product discipline more critical, not less.
Implementing AI tools in a company that lacks a clear product strategy and deep customer knowledge doesn't speed up successful development; it only accelerates aimless activity. True acceleration comes from applying AI to a well-defined direction informed by user understanding.
AI-generated "work slop"—plausible but low-substance content—arises from a lack of specific context. The cure is not just user training but building systems that ingest and index a user's entire work graph, providing the necessary grounding to move from generic drafts to high-signal outputs.
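One way to read that prescription is as a retrieval step before generation: index the user's work artifacts, pull the most relevant ones for a request, and put them in the prompt so the draft starts from specific context instead of generic filler. The sketch below is a minimal illustration of that idea only; the document names, the keyword-overlap scoring (a stand-in for a real embedding index), and every function here are hypothetical, not any vendor's actual system.

```python
# Minimal sketch of grounding a request in a user's own work context.
# Keyword overlap stands in for a real embedding/semantic index.

def tokenize(text):
    """Lowercased whitespace tokens; crude, for illustration only."""
    return set(text.lower().split())

def build_index(documents):
    """Pair each document with its token set."""
    return [(doc, tokenize(doc)) for doc in documents]

def retrieve(index, query, k=2):
    """Return the k documents sharing the most tokens with the query."""
    q = tokenize(query)
    scored = sorted(index, key=lambda item: len(q & item[1]), reverse=True)
    return [doc for doc, _ in scored[:k]]

def grounded_prompt(index, request):
    """Prepend retrieved context so the model drafts from specifics."""
    context = retrieve(index, request)
    lines = "\n".join(f"- {c}" for c in context)
    return f"Context:\n{lines}\n\nTask: {request}"

# Illustrative "work graph" entries (all invented).
docs = [
    "Q3 launch plan for the billing revamp, owner: Priya",
    "Incident review: checkout latency spike on June 4",
    "Design doc: migrating invoices to the new billing service",
]
index = build_index(docs)
prompt = grounded_prompt(index, "Draft a status update on the billing revamp")
```

The point of the sketch is the shape, not the scoring: without the retrieval step, the model only has the generic task string, which is exactly the condition that produces plausible-but-hollow drafts.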
Research highlights "work slop": AI output that appears polished but lacks human context. This forces coworkers to spend significant time fixing it, effectively offloading cognitive labor and damaging perceptions of the sender's capability and trustworthiness.
Teams that become over-reliant on generative AI as a silver bullet are destined to fail. True success comes from teams that remain "maniacally focused" on user and business value, using AI with intent to serve that purpose, not as the purpose itself.
Companies racing to add AI features while ignoring core product principles—like solving a real problem for a defined market—are creating a wave of failed products, dubbed "AI slop" by product coach Teresa Torres.
A new risk for engineering leaders is becoming a "vibe coding boss": using AI to set direction but misjudging its output as 95% complete when it's only 5%. This burdens the team with cleaning up a "big mess of slop" rather than accelerating development.