To avoid inconsistent or 'vibe coded' documentation, Atlassian's design system team built scripts that crawl their front-end monorepo. These scripts automatically generate structured guideline files for AI consumption by extracting component definitions, types, and usage examples directly from the production source code.
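
A minimal sketch of what such an extraction script could look like, assuming a `packages/*/src` layout and the ts-morph library (the globs, naming convention, and output format here are illustrative, not Atlassian's actual implementation):

```typescript
// Crawl component packages and emit one guideline file per component.
import { Project } from "ts-morph";
import { mkdirSync, writeFileSync } from "node:fs";

const project = new Project();
project.addSourceFilesAtPaths("packages/*/src/**/*.tsx");

mkdirSync("ai-guidelines", { recursive: true });

for (const file of project.getSourceFiles()) {
  for (const iface of file.getInterfaces()) {
    // Convention (assumed here): exported FooProps interfaces mark components.
    if (!iface.isExported() || !iface.getName().endsWith("Props")) continue;

    const component = iface.getName().replace(/Props$/, "");
    const props = iface
      .getProperties()
      .map((p) => `- \`${p.getName()}\`: \`${p.getType().getText()}\``);

    // One structured, machine-readable guideline file per component.
    writeFileSync(
      `ai-guidelines/${component}.md`,
      [`# ${component}`, "", `Source: ${file.getFilePath()}`, "", "## Props", ...props].join("\n")
    );
  }
}
```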

Related Insights

At Perplexity, the design system lives in the codebase, not Figma. Designers contribute directly to the frontend, creating a single source of truth that eliminates drift between design files and production code and forces a highly practical, collaborative process.

For adding smaller, self-contained components like a chatbox or dark mode toggle, Atlassian created 'recipes': pre-packaged code snippets with instructions that users can paste into an existing prototype. This modular approach lets people make minor additions without starting over from a full template.
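
A hypothetical recipe might be nothing more than a self-contained file whose paste-in instructions travel with the code (the component and class names here are invented):

```tsx
// RECIPE: dark-mode-toggle
// 1. Save this file into your prototype as DarkModeToggle.tsx.
// 2. Render <DarkModeToggle /> anywhere inside your existing layout.
// 3. Style the dark state via a `.dark` selector on the root element.
import { useEffect, useState } from "react";

export function DarkModeToggle() {
  const [dark, setDark] = useState(false);

  // Toggle a class on <html> so existing styles can respond to `.dark`.
  useEffect(() => {
    document.documentElement.classList.toggle("dark", dark);
  }, [dark]);

  return (
    <button onClick={() => setDark((d) => !d)}>
      {dark ? "Switch to light mode" : "Switch to dark mode"}
    </button>
  );
}
```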

AI prototyping shifts the purpose of a design system from a human-centric resource, reinforced through culture and reviews, to a machine-readable memory bank. The primary function becomes documenting rules and components in a way that provides a persistent, queryable knowledge base for an AI agent to access at all times.

Atlassian improved AI accuracy by instructing it to first think in a familiar framework like Tailwind CSS, then providing a translation map to their proprietary design system components. This bridges the gap between the AI's training data and the company's unique UI language, reducing component hallucinations.
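
Such a translation map can be as simple as a lookup table included in the prompt; the tokens and components on the right are illustrative stand-ins for a proprietary system:

```typescript
// Tailwind idioms the model already knows, mapped to (hypothetical)
// design-system equivalents it should actually emit.
export const tailwindToDesignSystem: Record<string, string> = {
  // Spacing utilities -> spacing tokens
  "p-4": "padding: token('space.200')",
  "gap-2": "gap: token('space.100')",
  // Color utilities -> semantic color tokens, never raw hex values
  "bg-blue-500": "backgroundColor: token('color.background.brand')",
  "text-gray-500": "color: token('color.text.subtle')",
  // Hand-styled elements -> real components
  "styled <button>": "<Button appearance='primary'>",
  "styled <input>": "<Textfield />",
};
```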

Instead of scattering knowledge across siloed note-taking apps, consolidate all of it (code, writing, proposals, notes) into a single GitHub monorepo. This creates a unified, context-rich environment that any AI coding assistant can access. The approach avoids vendor lock-in and gives the AI a comprehensive "second brain" to work from.
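
One possible layout for such a monorepo (the folder names are illustrative):

```
second-brain/
├── code/          # projects and experiments
├── writing/       # essays, drafts, blog posts
├── proposals/     # pitches and PRDs
├── notes/         # meeting notes, research snippets
└── README.md      # orientation for humans and AI assistants alike
```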

Moving PRDs and other product artifacts from Confluence or Notion directly into the codebase's repository gives AI coding assistants persistent, local context. This adjacency means the AI doesn't need external tool access (such as an MCP server) to understand the 'why' behind the code, leading to better suggestions and iterations.
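
For example, in a hypothetical service repository the product artifacts simply sit beside the source:

```
checkout-service/
├── src/
├── docs/
│   ├── prd-checkout-redesign.md   # the "why" lives next to the code
│   └── decisions/                 # architecture decision records
└── README.md
```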

Move beyond basic AI prototyping by exporting your design system into a machine-readable format like JSON. By feeding this into an AI agent, you can generate high-fidelity, on-brand components and code that engineers can use directly, dramatically accelerating the path from idea to implementation.
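
A hypothetical slice of such an export, pairing design tokens with component metadata and usage rules:

```json
{
  "tokens": {
    "color.background.brand": "#0C66E4",
    "space.100": "8px",
    "font.heading.medium": "600 16px/24px sans-serif"
  },
  "components": {
    "Button": {
      "props": { "appearance": ["primary", "subtle", "danger"] },
      "usage": "Primary actions only; at most one primary button per view."
    }
  }
}
```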

To get consistent, high-quality results from AI coding assistants, define reusable instructions in dedicated files (e.g., `prd.md`) within your repository. This "agent briefing" file can be referenced in prompts, ensuring all generated assets adhere to a predefined structure and style.
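
A briefing file along these lines (contents invented for illustration) can then be invoked with a prompt like "follow the structure defined in `prd.md`":

```markdown
# prd.md: structure for all generated PRDs

Every PRD must contain, in order:

1. **Problem**: one paragraph, no proposed solutions.
2. **Goals and non-goals**: bulleted, measurable where possible.
3. **Proposed solution**: include at least one alternative considered.
4. **Risks**: technical, legal, and rollout.

## Style

- Plain prose; no marketing language.
- Link related docs by relative path within this repository.
```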

Documentation is shifting from a passive reference for humans to an active, queryable context for AI agents. Well-structured docs on internal APIs and class hierarchies become crucial for agent performance, replacing slow, inefficient context-window stuffing and speeding up code generation.

Instead of generating UIs from scratch, Atlassian provides AI tools with a pre-coded template containing complex elements like navigation. The AI is much better at modifying existing code than creating complex layouts from nothing, reducing the error rate for navigation elements from 50% to nearly zero.
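
In that spirit, a starter template can ship with known-good navigation pre-built and mark the only region the AI should touch (all names here are illustrative):

```tsx
import { ReactNode } from "react";

function AppShell({ children }: { children: ReactNode }) {
  return (
    <div className="app">
      {/* Pre-coded, verified navigation: the AI never generates this. */}
      <nav>
        <a href="/">Home</a>
        <a href="/projects">Projects</a>
        <a href="/settings">Settings</a>
      </nav>
      <main>{children}</main>
    </div>
  );
}

export default function App() {
  return (
    <AppShell>
      {/* AI-EDITABLE REGION: prompts instruct the model to modify only
          what renders here, never the surrounding shell. */}
      <h1>Replace me with the prototype screen</h1>
    </AppShell>
  );
}
```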