AI can easily generate content that satisfies process requirements but lacks real value ("work slop"). This is less of a problem in outcome-focused cultures, where work is measured against customer-centric KPIs, than in process-driven ones that merely reward task completion.

Related Insights

AI makes generating high volumes of content easy, but this introduces "work slop" where quantity overwhelms quality. The new organizational challenge isn't production but sifting through excessive, low-value output. This shifts the most important work from creation to curation and judgment.

If applying GenAI to a process doesn't improve key metrics like revenue or cost, it's a sign that the original human task was likely low-value or "BS work." The AI exposes work that doesn't contribute to business outcomes, prompting re-evaluation of its necessity.

To avoid "AI slop"—the proliferation of low-quality AI outputs—Dell's CTO advocates for a disciplined, top-down strategy. Instead of letting tools run wild, they focus on a small number of high-impact use cases with clear business outcomes, ensuring quality and preventing chaos.

As AI handles more routine tasks, traditional productivity metrics like "tasks completed" become obsolete. The focus must shift from output to outcomes: what matters is no longer what was done on a given day, but how tools were used to achieve a specific business goal.

The primary issue with low-effort AI-generated work is not its poor quality but how it transfers the cognitive burden of correction and completion to the recipient. It masquerades as finished work while creating interpersonal friction and hidden rework, fundamentally shifting responsibility for the task's success.

Employees produce low-quality AI work not because they are lazy, but as a symptom of a leadership problem. The combination of generalized mandates to use AI and increased workload expectations creates a perfect storm in which "work slop" becomes a survival mechanism rather than a productivity tool.

AI is increasingly used to produce low-quality outputs like emails and reports, termed "work slop." While quick to create, this content is often so vague or useless that it makes colleagues' jobs harder, increasing overall administrative burden and hindering real progress.

Research highlights "work slop": AI output that appears polished but lacks human context. This forces coworkers to spend significant time fixing it, effectively offloading cognitive labor and damaging perceptions of the sender's capability and trustworthiness.

Before surveying employees or analyzing output, leaders can diagnose a high risk of AI "work slop" with a simple test: is AI use mandated? A strategy built on mandates creates pressure that makes employees far more likely to produce low-quality, box-ticking AI work.

According to Dropbox's VP of Engineering, the flood of low-quality, AI-generated "work slop" isn't a technology problem, but a strategy problem. When leaders push for AI adoption without defining crisp use cases and goals, employees are left to generate generic content that fails to add real value.