The handoff between AI generation and manual refinement is a major friction point. Tools like Subframe solve this by allowing users to seamlessly switch between an 'Ask AI' mode for generative tasks and a 'Design' mode for manual, Figma-like adjustments on the same canvas.

Related Insights

Standard AI coding tools force a linear A-to-B iteration process, which stifles the divergent thinking essential for design exploration. Tools with a 'canvas' feature allow designers to visualize, track, and branch off multiple design paths simultaneously, better mirroring the creative process.
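
As a rough illustration, the branching model amounts to keeping a tree of design states rather than a single linear history. The sketch below is a minimal, hypothetical TypeScript model of such a canvas; it is not any particular tool's data structure, and all names are invented for illustration.

```typescript
// Hypothetical sketch of branching design exploration on a canvas
// (illustrative only, not any specific tool's data model).
// Each node is a generated design state; branches let several directions
// evolve in parallel instead of a single linear A-to-B history.
interface DesignNode {
  id: number;
  prompt: string;          // the instruction that produced this state
  parentId: number | null; // null for the root exploration
  children: number[];      // branches forked from this state
}

class ExplorationCanvas {
  private nextId = 0;
  private nodes = new Map<number, DesignNode>();

  addRoot(prompt: string): DesignNode {
    const node: DesignNode = { id: this.nextId++, prompt, parentId: null, children: [] };
    this.nodes.set(node.id, node);
    return node;
  }

  // Fork a new direction from any earlier state instead of overwriting it.
  branch(fromId: number, prompt: string): DesignNode {
    const parent = this.nodes.get(fromId);
    if (!parent) throw new Error(`Unknown node: ${fromId}`);
    const node: DesignNode = { id: this.nextId++, prompt, parentId: fromId, children: [] };
    parent.children.push(node.id);
    this.nodes.set(node.id, node);
    return node;
  }
}

// Usage: explore two divergent directions from the same starting point.
const canvas = new ExplorationCanvas();
const root = canvas.addRoot("pricing page, three tiers");
canvas.branch(root.id, "make it playful, bold colors");
canvas.branch(root.id, "make it minimal, monochrome");
```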

AI-powered "vibe coding" is reversing the design workflow. Instead of starting in Figma, designers now build functional prototypes directly with code-generating tools. Figma has shifted from being the first step (exploration) to the last step (fine-tuning the final 20% of pixel-perfect details).

Complex AI-generated assets like slide decks are often not directly editable. The new creative workflow is not about manual tweaks but about refining prompts and regenerating the output. Mastery of this iterative process is becoming a critical skill for creative professionals.

When iterating on a Gemini 3.0-generated app, the host uses the annotation feature to draw directly on the preview to request changes. This visual feedback loop enables more precise, context-specific design adjustments than text descriptions alone, which are often ambiguous.

Most generative AI tools get users 80% of the way to their goal, but refining the final 20% is difficult without starting over. The key innovation of tools like the AI video animator Waffer is enabling precise, iterative edits via text commands (e.g., "zoom in at 1.5 seconds"). This level of control is the next major step for creative AI tools.
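
As a rough illustration of what precise edits via text commands could mean mechanically, the hypothetical TypeScript sketch below parses a command like the one quoted above into a structured timeline operation. It is not Waffer's actual API; the types and parser are invented for illustration.

```typescript
// Hypothetical sketch (illustrative only, not Waffer's actual API): mapping a
// natural-language edit command onto a structured timeline operation, so one
// precise change does not require regenerating the whole video.
type EditOp =
  | { kind: "zoom"; atSeconds: number; factor: number }
  | { kind: "trim"; startSeconds: number; endSeconds: number };

function parseEditCommand(command: string): EditOp | null {
  // e.g. "zoom in at 1.5 seconds"
  const zoom = command.match(/zoom in at ([\d.]+) seconds?/i);
  if (zoom) {
    // Default zoom factor; a fuller parser would extract this from the command too.
    return { kind: "zoom", atSeconds: parseFloat(zoom[1]), factor: 1.5 };
  }

  // e.g. "trim from 2 to 4 seconds"
  const trim = command.match(/trim from ([\d.]+) to ([\d.]+) seconds?/i);
  if (trim) {
    return { kind: "trim", startSeconds: parseFloat(trim[1]), endSeconds: parseFloat(trim[2]) };
  }

  return null; // unrecognized command: fall back to re-prompting
}

console.log(parseEditCommand("zoom in at 1.5 seconds"));
// -> { kind: "zoom", atSeconds: 1.5, factor: 1.5 }
```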

While AI tools excel at generating initial drafts of code or designs, their editing capabilities are poor. The difficulty of making specific changes often forces creators to discard the AI output and start over, as editing is where the "magic" breaks down.

As AI models become proficient at generating high-quality UI from prompts, the value of manual design execution will diminish. A professional designer's key differentiator will become their ability to build the underlying, unique component libraries and design systems that AI will use to create those UIs.

Leverage AI as an idea generator rather than a final execution tool. By prompting for multiple "vastly different" options—like hover effects—you can review a range of possibilities, select a promising direction, and then iterate, effectively using AI to explore your own taste.

AI often generates several good ideas across multiple prototypes. Instead of recreating them manually, use a tool like Subframe that allows you to directly drag and drop components from one AI-generated variant into another. This 'kitbashing' approach accelerates the creation of a polished design.

AI tools can drastically increase the volume of initial creative explorations, moving from 3 directions to 10 or more. The designer's role then shifts from pure creation to expert curation, using their taste to edit AI outputs into winning concepts.