A huge portion of product development involves creating user interfaces for backend databases. AI-powered inference engines will allow users to state complex goals in natural language, bypassing the need for traditional UIs and fundamentally changing software development.
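To make that concrete, here is a minimal sketch of the pattern, assuming a generic `complete` helper that stands in for whatever model API is in use; the schema, prompt, and guardrail are illustrative, not any particular product's implementation. The user states a goal in plain language, and the model produces the database query that a filter-and-form UI would otherwise have to collect field by field.

```typescript
// Sketch: translating a natural-language goal into a database query,
// skipping the traditional filter/form UI entirely.
// `complete` stands in for any LLM call; wire it to your own provider.
async function complete(_prompt: string): Promise<string> {
  // Placeholder: canned answer so the sketch runs end to end without a live model.
  return "SELECT name, mrr FROM customers WHERE churn_risk > 0.8 ORDER BY mrr DESC LIMIT 20;";
}

// Hypothetical schema for illustration only.
const SCHEMA = `
  customers(id, name, mrr, churn_risk, signup_date)
  invoices(id, customer_id, amount, paid_at)
`;

async function queryFromIntent(userGoal: string): Promise<string> {
  const sql = await complete(
    `You write read-only SQL for this schema:\n${SCHEMA}\n` +
    `Return a single SELECT statement, nothing else.\n` +
    `Goal: ${userGoal}`
  );
  // Guardrail: the generated text is untrusted input, not a UI widget.
  if (!/^\s*select\b/i.test(sql) || /;\s*\S/.test(sql)) {
    throw new Error("Model returned something other than one SELECT statement");
  }
  return sql;
}

queryFromIntent("show my 20 biggest accounts that look likely to churn")
  .then((sql) => console.log(sql));
```

The guardrail is the point worth noticing: the natural-language layer replaces the interface, not the validation that used to sit behind it.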
Figma CEO Dylan Field predicts we will look back on today's text prompting as a primitive command-line interface for AI, much as we now view MS-DOS. The next major opportunity is to build intuitive, use-case-specific interfaces that act as a compass for AI's latent space, giving users more precise control than text alone.
AI is becoming the new UI, allowing users to generate bespoke interfaces for specific workflows on the fly. This fundamentally threatens the core value proposition of many SaaS companies, which is essentially selling a complex UX built on a database. The entire ecosystem will need to adapt.
The best agentic UX isn't a generic chat overlay. Instead, identify where users struggle with complex inputs like formulas or code. Replace these friction points with a native, natural language interface that directly integrates the AI into the core product workflow, making it feel seamless and powerful.
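As a hedged illustration of that pattern, the sketch below assumes a spreadsheet-like product where formula entry is the friction point; `complete` again stands in for any model call, and the column names and commission example are made up. The natural-language request resolves into a formula that lands directly in the cell, rather than in a separate chat transcript.

```typescript
// Sketch: replacing a formula-entry friction point with natural language,
// inline in the product rather than in a detached chat panel.
async function complete(_prompt: string): Promise<string> {
  return "=IF(B2>=1000, B2*0.15, B2*0.05)"; // canned response for the sketch
}

async function formulaFromDescription(description: string, columns: string[]): Promise<string> {
  const formula = await complete(
    `Columns available: ${columns.join(", ")}.\n` +
    `Write one spreadsheet formula (starting with "=") that does the following, ` +
    `and return only the formula:\n${description}`
  );
  if (!formula.trim().startsWith("=")) {
    throw new Error("Expected a formula, got prose");
  }
  return formula.trim();
}

formulaFromDescription(
  "give 15% commission on deals of $1000 or more, otherwise 5%",
  ["A: rep name", "B: deal size"]
).then((f) => console.log(f)); // the formula lands in the cell, not in a chat window
```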
AI will fundamentally change user interfaces. Instead of designers pre-building UIs, AI will generate the necessary "forms and lists" on the fly from a user's natural-language request. For the first time, the user rather than the developer will be the one creating the interface.
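One plausible mechanism, sketched below under the assumption that the app renders a declarative schema: the model returns a small JSON description of the "form and list" the user asked for, and the product's own renderer draws it. The field types and the `complete` stub are illustrative, not a real framework's API.

```typescript
// Sketch: the model emits a declarative UI schema on demand; the app only renders it.
type Field =
  | { kind: "text"; label: string }
  | { kind: "number"; label: string }
  | { kind: "select"; label: string; options: string[] };

interface FormSpec {
  title: string;
  fields: Field[];
}

async function complete(_prompt: string): Promise<string> {
  // Canned response so the sketch runs without a live model.
  return JSON.stringify({
    title: "Log a customer call",
    fields: [
      { kind: "text", label: "Customer name" },
      { kind: "select", label: "Outcome", options: ["Booked demo", "Follow up", "Not a fit"] },
      { kind: "number", label: "Deal size ($)" },
    ],
  });
}

async function formForRequest(request: string): Promise<FormSpec> {
  const raw = await complete(
    `Return JSON matching {title, fields:[{kind,label,options?}]} for this request: ${request}`
  );
  return JSON.parse(raw) as FormSpec; // validate against the schema in real code
}

formForRequest("I need a quick form to log customer calls with the outcome and deal size")
  .then((spec) => console.log(spec.title, spec.fields.length, "fields"));
```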
AI development has evolved to the point where models can be directed in ordinary human language. Instead of complex prompt engineering or fine-tuning, developers can provide instructions, documentation, and context in plain English to guide the AI's behavior, putting sophisticated outcomes within reach of far more builders.
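A minimal sketch of that working style, with made-up product documentation: the instructions, the docs, and the user's question are simply concatenated in plain English, with no fine-tuning step and no special prompt syntax assumed.

```typescript
// Sketch: steering a model with plain-English instructions plus pasted documentation,
// instead of fine-tuning or elaborate prompt engineering. Content is illustrative.
const instructions = `
You are the billing assistant inside our product.
Answer only from the documentation below.
If the docs don't cover it, say so and suggest contacting support.
`;

const productDocs = `
Refunds: self-serve within 14 days from Settings > Billing.
Annual plans: prorated credit, no cash refunds after 14 days.
`;

function buildContext(userQuestion: string): string {
  // Everything the model needs is stated in ordinary language.
  return [instructions.trim(), "Documentation:", productDocs.trim(), "Question:", userQuestion]
    .join("\n\n");
}

console.log(buildContext("Can I get my money back? I'm 10 days into an annual plan."));
```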
As AI models become proficient at generating high-quality UI from prompts, the value of manual design execution will diminish. A professional designer's key differentiator will become their ability to build the underlying, unique component libraries and design systems that AI will use to create those UIs.
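Building on the form-generation sketch above, this is one way that differentiation could look in practice: the design system is exposed to the model as a constrained catalog it must compose from. The component names and tokens here are hypothetical.

```typescript
// Sketch: the designer's component library becomes the vocabulary the model is
// allowed to compose with when generating a screen.
const designSystem = {
  components: ["Card", "DataTable", "MetricTile", "FilterBar", "EmptyState"],
  tokens: { spacing: [4, 8, 16, 24], radius: 8 },
};

function uiGenerationPrompt(userRequest: string): string {
  return (
    `Compose a screen using ONLY these components: ${designSystem.components.join(", ")}.\n` +
    `Spacing tokens: ${designSystem.tokens.spacing.join(", ")}px. Corner radius: ${designSystem.tokens.radius}px.\n` +
    `Return a JSON tree of {component, props, children}.\n` +
    `Request: ${userRequest}`
  );
}

console.log(uiGenerationPrompt("a dashboard showing churn risk by account"));
```

The per-screen layout becomes cheap to generate; the constrained vocabulary it is generated from is the part the model cannot invent, which is why the library itself becomes the designer's leverage.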
The next user interface paradigm is delegation, not direct manipulation. Humans will communicate with AI agents via voice, instructing them to perform complex tasks on computers. This will shift daily work from hours of clicking and typing to zero, fundamentally changing our relationship with technology.
With AI, designers are no longer just guessing user intent to build static interfaces. Their new primary role is to facilitate the interaction between a user and the AI model, helping users communicate their intent, understand the model's response, and build a trusted relationship with the system.
Chatbots are fundamentally linear, which is ill-suited for complex tasks like planning a trip. The next generation of AI products will use AI as a co-creation tool within a more flexible canvas-like interface, allowing users to manipulate and organize AI-generated content non-linearly.
AI tools that generate functional UIs from prompts are eliminating the 'language barrier' between marketing, design, and engineering teams. Marketers can now hand over a working visual prototype of what they want instead of an ambiguous text-based brief, keeping teams aligned and drastically shortening development cycles.