The experience of building an electric aircraft, described as a "flying robot," is directly transferable to humanoid robotics. Both require deep expertise in integrating batteries, motors, embedded systems, sensors, and control software, creating a natural pathway for talent and knowledge to flow between the two deep-tech fields.
Insiders in top robotics labs are witnessing fundamental breakthroughs. These “signs of life,” while rudimentary now, are clear precursors to a rapid transition from research to widely adopted products, much like AI before ChatGPT’s public release.
The 1X robot's teleoperation, often seen as a sign of immaturity, is actually a key feature. It enables both a "human-in-the-loop" expert service for complex tasks and personal remote control, like checking on a pet, delivering immediate utility without waiting for full autonomy.
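A minimal sketch of how such a dual-mode teleoperation tier could sit alongside autonomy; the mode names, confidence threshold, and routing logic here are illustrative assumptions, not 1X's actual system:

```python
from dataclasses import dataclass
from enum import Enum, auto


class TeleopMode(Enum):
    """Illustrative modes: who is driving the robot at any moment."""
    AUTONOMOUS = auto()        # on-robot policy handles the task
    EXPERT_OPERATOR = auto()   # trained remote operator steps in for complex tasks
    OWNER_REMOTE = auto()      # the owner drives directly, e.g. to check on a pet


@dataclass
class Task:
    name: str
    policy_confidence: float          # autonomy stack's own estimate of success
    owner_requested_control: bool = False


def route(task: Task, confidence_threshold: float = 0.8) -> TeleopMode:
    """Decide who controls the robot for this task.

    The point is that teleoperation is not a stopgap: it is an explicit
    fallback tier that delivers utility before full autonomy exists.
    """
    if task.owner_requested_control:
        return TeleopMode.OWNER_REMOTE
    if task.policy_confidence < confidence_threshold:
        return TeleopMode.EXPERT_OPERATOR
    return TeleopMode.AUTONOMOUS


print(route(Task("unload dishwasher", policy_confidence=0.55)))   # EXPERT_OPERATOR
print(route(Task("check on the dog", 0.9, owner_requested_control=True)))  # OWNER_REMOTE
```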
Instead of creating bespoke self-driving kits for every car model, a humanoid robot can physically sit in any driver's seat and operate the controls. This concept, highlighted by George Hotz, bypasses proprietary vehicle systems and hardware lock-in, treating the car as a black box.
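A rough sketch of the "black box" framing: the robot's only interface to any vehicle is the physical controls a human would use, so nothing below is vehicle-specific. The class and field names are hypothetical, chosen only to make the idea concrete:

```python
from dataclasses import dataclass


@dataclass
class DrivingCommand:
    """The only 'API' the robot needs: the same controls a human driver uses."""
    steering_angle_deg: float   # applied by turning the wheel
    throttle: float             # 0.0..1.0, applied with the pedal
    brake: float                # 0.0..1.0, applied with the pedal


class BlackBoxCar:
    """Any car with a seat, a wheel, and pedals. No CAN bus, no OEM SDK."""

    def apply(self, cmd: DrivingCommand) -> None:
        # The humanoid physically actuates the controls; the car's internal
        # software is never touched, so there is no per-model integration.
        print(f"steer={cmd.steering_angle_deg:+.1f} deg, "
              f"throttle={cmd.throttle:.2f}, brake={cmd.brake:.2f}")


# The same driving policy works in any vehicle the robot can sit in.
car = BlackBoxCar()
car.apply(DrivingCommand(steering_angle_deg=-5.0, throttle=0.2, brake=0.0))
```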
Figure is observing that data from one robot performing a task (e.g., moving packages in a warehouse) improves the performance of other robots on completely different tasks (e.g., folding laundry at home). This powerful transfer learning, enabled by deep learning, is a key driver for scaling general-purpose capabilities.
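One way to see the mechanism: if all robots share a single policy network and tasks differ only in their conditioning, then gradient updates from warehouse data also refine the shared representations the laundry task relies on. A toy PyTorch sketch with illustrative dimensions, not Figure's actual architecture:

```python
import torch
import torch.nn as nn


class SharedPolicy(nn.Module):
    """Toy multi-task policy: one shared trunk, task specified by an embedding."""

    def __init__(self, obs_dim: int = 64, num_tasks: int = 2, act_dim: int = 8):
        super().__init__()
        self.task_embedding = nn.Embedding(num_tasks, 16)
        self.trunk = nn.Sequential(            # shared across all tasks and robots
            nn.Linear(obs_dim + 16, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
        )
        self.action_head = nn.Linear(128, act_dim)

    def forward(self, obs: torch.Tensor, task_id: torch.Tensor) -> torch.Tensor:
        z = torch.cat([obs, self.task_embedding(task_id)], dim=-1)
        return self.action_head(self.trunk(z))


policy = SharedPolicy()
warehouse_obs = torch.randn(32, 64)   # batch from "move packages"
laundry_obs = torch.randn(32, 64)     # batch from "fold laundry"
warehouse_actions = policy(warehouse_obs, torch.zeros(32, dtype=torch.long))
laundry_actions = policy(laundry_obs, torch.ones(32, dtype=torch.long))
# A gradient step on the warehouse batch changes `policy.trunk`, which the
# laundry task also uses; that shared parameterization is the transfer.
```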
Progress in robotics for household tasks is limited by a scarcity of real-world training data, not mechanical engineering. Companies are now deploying capital-intensive "in-field" teams to collect multi-modal data from inside homes, capturing the complexity of mundane human activities to train more capable robots.
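A sketch of what a single multi-modal, in-home training record might contain; the field names and modalities are assumptions for illustration, not any company's actual schema:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Frame:
    """One synchronized timestep captured inside a home."""
    timestamp_s: float
    rgb_path: str                 # camera image file
    depth_path: str               # depth map file
    audio_path: str               # ambient audio clip
    joint_positions: List[float]  # robot or wearable proprioception
    language_annotation: str      # e.g. "person loads the dishwasher"


@dataclass
class Episode:
    """A full recording of one mundane household activity."""
    home_id: str
    task_label: str
    frames: List[Frame] = field(default_factory=list)


episode = Episode(home_id="home_0042", task_label="tidy living room")
episode.frames.append(Frame(0.0, "rgb/0.png", "depth/0.png", "audio/0.wav",
                            [0.0] * 7, "person picks up a cushion"))
```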
Experience in robotics, where systems often fail, cultivates resilience and a deep focus on analyzing data to debug problems. This "gritty" skill set is highly transferable and valuable in the world of large language models, where perseverance and data intuition are key.
Car companies are uniquely positioned to build humanoid robots. They possess deep expertise in mass manufacturing complex systems with chips and batteries, and they are already heavy users of robotics in their own factories, giving them a significant advantage in the emerging market.
General-purpose robotics lacks standardized interfaces between hardware, data, and AI. This makes a full-stack, in-house approach essential because the definition of 'good' for each component is constantly co-evolving. Partnering is difficult when your standard of quality is a moving target.
Intuition Robotics' core bet is that the transfer from simulated to physical worlds is unlocked by a shared action interface. Since many real-world robots like drones and arms are already operated with game controllers, an agent trained in diverse gaming environments only needs to adapt to a new visual world, not an entirely new action space.
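A minimal sketch of that shared action interface, assuming a standard gamepad layout; the class and method names are hypothetical and only illustrate the bet that the action space stays fixed while the visual world changes:

```python
from dataclasses import dataclass


@dataclass
class GamepadAction:
    """The shared action interface: two sticks and a trigger, nothing robot-specific."""
    left_x: float   # -1..1
    left_y: float   # -1..1
    right_x: float  # -1..1
    right_y: float  # -1..1
    trigger: float  # 0..1


class GameEnvironment:
    """Training world: games already consume gamepad input natively."""

    def step(self, action: GamepadAction) -> None:
        print(f"game tick, sticks ({action.left_y:+.1f}, {action.right_x:+.1f})")


class DroneInterface:
    """Deployment world: many real drones are already flown from this same
    controller layout (left stick throttle/yaw, right stick pitch/roll)."""

    def step(self, action: GamepadAction) -> None:
        print(f"drone command, sticks ({action.left_y:+.1f}, {action.right_x:+.1f})")


def policy(observation) -> GamepadAction:
    """Whatever the agent learned in games. At deployment only its visual input
    distribution changes; the action space stays identical."""
    return GamepadAction(0.0, 0.3, 0.1, 0.0, 0.5)


for world in (GameEnvironment(), DroneInterface()):
    world.step(policy(observation=None))
```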
Unlike older robots, which required precise maps and explicit trajectory calculations, new robots draw on internet-scale common sense and learn motion by imitating humans or by training in simulation. This combination has “wiped the slate clean” for what is possible in the field.
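A schematic contrast between the two approaches, with stub functions standing in for a classical planner and for a pretrained vision-language-action model; none of these names refer to a specific system:

```python
from typing import List, Tuple

Pose = Tuple[float, float, float]


# Older pipeline: explicit map, explicit trajectory.
def plan_path(world_map: dict, start: Pose, goal: Pose) -> List[Pose]:
    """Stub for a classical motion planner that needs a precise map."""
    return [start, goal]


def classical_pick(world_map: dict, robot_pose: Pose, object_pose: Pose) -> List[Pose]:
    # Every new object or scene means updating the map and recomputing trajectories.
    return plan_path(world_map, robot_pose, object_pose)


# Newer approach: one learned policy with internet-scale priors.
def learned_pick(camera_image, instruction: str) -> List[float]:
    """Stub standing in for a pretrained vision-language-action model: it knows
    what a 'mug' is from web-scale data and how to move from human demonstrations
    and simulation, so no map or hand-built trajectory is required."""
    return [0.0] * 7  # placeholder joint command


classical_pick({}, (0.0, 0.0, 0.0), (0.5, 0.2, 0.1))
learned_pick(camera_image=None, instruction="pick up the mug")
```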