AI enables rapid book creation by generating chapters and citing sources. This creates a new problem: authors can produce works on complex topics without ever reading the source material or developing deep understanding. The resulting "AI slop" presents a veneer of expertise without the genuine, absorbed knowledge of a human author.

Related Insights

The problem with bad AI-generated work ('slop') isn't just poor writing. It's that subtle inaccuracies or lost context can derail meetings and trigger long, energy-wasting debates. The resulting cognitive overload makes it harder for teams to make sense of the work and ultimately costs more in human time than it saves.

While AI tools once gave creators an edge, their widespread adoption now risks producing undifferentiated output. IBM's AI VP, who built a following of 200k, now uses AI less. The new edge is spending more time on unique human thinking and using AI only for initial ideation, not final writing.

The "generative" label on AI is misleading. Its true power for daily knowledge work lies not in creating artifacts, but in its superhuman ability to read, comprehend, and synthesize vast amounts of information—a far more frequent and fundamental task than writing.

The internet's value stems from an economy of unique human creations. AI-generated content, or "slop," replaces this with low-quality, soulless output, breaking the internet's economic engine. This trend now appears in VC pitches, with founders presenting AI-generated ideas they don't truly understand.

While cheating is a concern, a more insidious danger is students using AI to bypass deep cognitive engagement. They can produce correct answers without retaining knowledge, creating a cumulative learning deficit that is difficult to detect and remedy.

Research highlights "workslop": AI output that appears polished but lacks human context. This forces coworkers to spend significant time fixing it, effectively offloading cognitive labor and damaging perceptions of the sender's capability and trustworthiness.

Identify an expert who hasn't written a book on a specific topic. Train an AI on their entire public corpus of interviews, podcasts, and articles. Then, prompt it to structure and synthesize that knowledge into the book they might have written, complete with their unique frameworks and quotes.
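As a rough illustration of that workflow, here is a minimal sketch assuming the expert's interviews, podcast transcripts, and articles have already been collected as plain-text files. The folder name, chapter outline, model name, and the load_corpus/synthesize_chapter helpers are illustrative placeholders, not a prescribed toolchain; the only concrete dependency is the OpenAI Python SDK's chat-completions call.

```python
# Minimal sketch: draft book chapters from an expert's public corpus.
# Assumes transcripts/articles are saved as .txt files in expert_corpus/
# and that OPENAI_API_KEY is set; paths, outline, and model name are
# illustrative assumptions.
from pathlib import Path
from openai import OpenAI  # pip install openai

client = OpenAI()

def load_corpus(folder: str) -> str:
    """Concatenate every transcript/article in the folder into one string."""
    docs = [p.read_text(encoding="utf-8") for p in Path(folder).glob("*.txt")]
    return "\n\n---\n\n".join(docs)

def synthesize_chapter(corpus: str, chapter_title: str) -> str:
    """Ask the model to write one chapter strictly from the supplied corpus."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name
        messages=[
            {"role": "system",
             "content": ("You are ghost-writing a book in the expert's own voice. "
                         "Use only the frameworks, examples, and quotes found in "
                         "the provided corpus; do not invent positions they never "
                         "stated.")},
            {"role": "user",
             "content": f"Corpus:\n{corpus}\n\nDraft the chapter: {chapter_title}"},
        ],
    )
    return response.choices[0].message.content

outline = ["Why the field is stuck", "The core framework", "Case studies"]
for title in outline:
    chapter = synthesize_chapter(load_corpus("expert_corpus/"), title)
    Path(f"{title}.md").write_text(chapter, encoding="utf-8")
```

In practice a full corpus will rarely fit in a single prompt, so a real pipeline would chunk the material and retrieve only the passages relevant to each chapter. The sketch's point is simply that the "book" is assembled from the model's synthesis of someone else's thinking rather than from the author's own reading.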

Don't use AI to generate generic thought leadership, which often just regurgitates existing content. The real power is using AI as a 'steroid' for your own ideas. Architect the core content yourself, then use AI to turbocharge research and data integration to make it 10x better.

Professionals are using AI to write detailed reports, while their managers use AI to summarize them. This creates a feedback loop where AI generates content for other AIs to consume, with humans acting merely as conduits. This "AI slop" replaces deep thought with inefficient, automated communication.

The ease of generating AI summaries is creating low-quality 'slop.' This imposes a hidden productivity cost, as collaborators must waste time clarifying ambiguous or incorrect AI-generated points, derailing work and leading to lengthy, unnecessary corrections.