The military's career path rewards generalist experience, effectively punishing officers who specialize in critical fields like AI and cyber. Talented specialists are forced to abandon their expertise to get promoted, leading many to leave the service not for money, but to continue doing the work they excel at.
Instead of choosing a career based on its perceived "safety" from AI, individuals should pursue their passions to quickly become domain experts. AI tools augment this expertise, increasing the value of experienced professionals who can handle complex, nuanced situations that AI cannot.
To move beyond general knowledge, AI firms are creating a new role: the "AI Trainer." These are not contractors but full-time employees, typically PhDs who combine deep domain expertise with an interest in computer science, tasked with systematically improving model competence in specific fields like physics or mathematics.
During tech gold rushes like AI, the most skilled engineers ("level 100 players") are drawn to lucrative but less impactful ventures. This creates a significant opportunity cost, as their talents are diverted from society's most pressing challenges, like semiconductor fabrication.
By replacing the foundational, detail-oriented work of junior analysts, AI prevents them from gaining the hands-on experience needed to build sophisticated mental models. This will lead to a future shortage of senior leaders with the deep judgment that only comes from being "in the weeds."
Bureaucracies, like AI models, have pre-programmed "weights" that shape decisions. The DoD is weighted toward its established branches (Army, Navy, etc.). Without a dedicated Cyber Force, cybersecurity is consistently de-prioritized in budgets, promotions, and strategic focus, a vulnerability that AI will amplify.
Tying sales development representative (SDR) promotions to time-in-seat fosters stagnation. Instead, create a clear, multi-level roadmap where advancement is based solely on hitting performance thresholds. This model rewards high achievers, provides constant motivation, and gives reps control over their career trajectory.
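A threshold-based ladder like this is simple enough to express directly. The sketch below is purely illustrative: the level names, metrics, and numeric thresholds are assumptions, not figures from the original.

```python
# Hypothetical sketch of a threshold-based promotion ladder for SDRs,
# where advancement depends only on hitting performance numbers, not tenure.
# All level names and threshold values are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Level:
    name: str
    min_meetings: int   # qualified meetings booked per quarter
    min_pipeline: int   # pipeline dollars sourced per quarter


# Ordered lowest to highest; each level's thresholds are cumulative gates.
LADDER = [
    Level("SDR I", 0, 0),
    Level("SDR II", 30, 150_000),
    Level("Senior SDR", 45, 300_000),
    Level("SDR Team Lead", 60, 500_000),
]


def current_level(meetings: int, pipeline: int) -> str:
    """Return the highest level whose thresholds the rep has met."""
    earned = LADDER[0].name
    for level in LADDER:
        if meetings >= level.min_meetings and pipeline >= level.min_pipeline:
            earned = level.name
    return earned


print(current_level(meetings=50, pipeline=320_000))  # Senior SDR
```

Because the check reads only from the ladder table, reps can see exactly which numbers stand between them and the next level, which is the transparency the roadmap model is meant to provide.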
Product managers at large AI labs are incentivized to ship safe, incremental features rather than risky, opinionated products. This structural aversion to risk creates a permanent market opportunity for startups to build bold, niche applications that incumbents are organizationally unable to pursue.
Experience alone no longer determines engineering productivity. An engineer's value is now a function of their experience plus their fluency with AI tools. Experienced coders who haven't adapted are now less valuable than AI-native recent graduates, who are in high demand.
The frenzied competition for the few thousand elite AI scientists has created a culture of constant job-hopping for higher pay, akin to a sports transfer season. This instability is slowing down major scientific progress, as significant breakthroughs require dedicated teams working together for extended periods, a rarity in the current environment.
A Meta engineer was denied a promotion despite a "Greatly Exceeds" rating due to a behavioral gap in cross-functional collaboration. This shows that a promotion is a lagging indicator: it hinges on consistently demonstrating the behaviors of the next level, not just on delivering high impact at the current one.