When junior employees are encouraged to use AI from day one, they fail to develop foundational skills. This "deskilling" means they won't be able to spot AI hallucinations or errors, ironically making them less competent and exposing them to greater liability, particularly in fields like law.

Related Insights

Professions like law and medicine rely on a pyramid structure where newcomers learn by performing basic tasks. If AI automates this essential junior-level work, the entire model for training and developing senior experts could collapse, creating an unprecedented skills and experience gap at the top.

By automating the rote work historically done by junior lawyers (e.g., discovery, basic contract drafting), AI threatens the profession's apprenticeship model. This "cognitive deskilling" may prevent new lawyers from gaining the foundational experience needed to become experts.

AI tools frequently produce incorrect information, with error rates as high as 30%. Relying on this technology to replace entry-level staff is a major risk, as newcomers are essential for learning and eventually providing the human oversight that fallible AI requires.

By replacing the foundational, detail-oriented work of junior analysts, AI prevents them from gaining the hands-on experience needed to build sophisticated mental models. This will lead to a future shortage of senior leaders with the deep judgment that only comes from being "in the weeds."

AI tools are taking over foundational research and drafting, tasks traditionally done by junior associates. This automation disrupts the legal profession's apprenticeship model, raising questions about how future senior lawyers will gain essential hands-on experience and skills.

While AI can augment experienced workers, relying on it to replace newcomers is a mistake. Its significant error rate (20-30%) requires human oversight and judgment that junior employees haven't yet developed, making it an unreliable substitute for on-the-job learning.

While AI can accelerate tasks like writing, the real learning happens during the creative process itself. By outsourcing the "doing" to AI, we risk losing the ability to think critically and synthesize information. Research suggests that offloading this cognitive work is physically remapping our brains, reducing our ability to think on our feet.

AI accelerates data retrieval, but it creates a dangerous knowledge gap. Junior employees can find facts (e.g., in a financial statement) without the experience-based judgment to understand their deeper connections and second-order consequences for the business.

While AI "hallucinations" grab headlines, the more systemic risk is lawyers becoming overly reliant on AI and failing to perform due diligence. The LexisNexis CEO predicts an attorney will eventually lose their license not because the AI failed, but because the human failed to properly review the work.

AI is breaking the traditional model where junior employees learn by doing repetitive tasks. As both interns and managers turn to AI, this learning loop is lost. This shift could make formal, structured education more critical for professional skill development in the future.

Over-reliance on AI Risks Creating a Generation of Incompetent Professionals