When school administrators impose top-down mandates for using specific AI systems, it becomes a labor issue. This approach strips teachers of their professional autonomy and control over their work environment, leading to significant demotivation, regardless of the tool's supposed benefits.
Rather than banning AI, one professor engages students by explaining how LLMs actually work: they predict the most statistically likely next word, with no regard for whether the result is true. Framing the tool as a "bullshitter," fundamentally incompatible with a historian's duty to factual accuracy, turns a cheating aid into a lesson in epistemology.
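A toy sketch can make the point concrete (this is my illustration, not the professor's actual classroom material): a tiny bigram model predicts each next word purely from co-occurrence counts, so whatever the corpus repeats most often comes out fluently, with no mechanism anywhere that checks truth.

```python
from collections import defaultdict, Counter

# Hypothetical toy corpus: the false claim simply appears more often.
corpus = (
    "the moon is made of rock . "
    "the moon is made of cheese . "
    "the moon is made of cheese ."
).split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word."""
    return bigrams[word].most_common(1)[0][0]

def generate(start, length):
    """Greedily extend a sentence one most-likely word at a time."""
    words = [start]
    for _ in range(length):
        words.append(predict_next(words[-1]))
    return " ".join(words)

# The model confidently asserts the majority claim, true or not.
print(generate("the", 5))  # → the moon is made of cheese
```

Real LLMs are vastly more sophisticated, but the core objective is the same: continue the text plausibly, which is exactly why fluency and factual accuracy can come apart.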
Contrary to the sales pitch, AI tools can create more work for educators. The time required to verify facts, fix AI-generated errors, and correct hallucinations in lesson plans or translations often negates any initial time savings, a pattern also observed with software coders.
The belief that children born into a tech-rich world inherently understand how to use digital tools for education is false. Research shows their proficiency with entertainment platforms like YouTube or Roblox does not equip them with the skills needed for actual learning applications, leading to flawed assumptions in the classroom.
Students often use AI not out of laziness, but as a logical coping mechanism for an educational system prioritizing final grades over the learning process. Facing immense pressure from multiple courses and jobs, they see AI as a tool to produce a required "product" and survive, revealing a flaw in the system's incentives.
Contrary to widespread panic, research indicates that only about 10% of students self-report using AI to generate an entire assignment, a figure in line with self-reported cheating rates that have held steady for years regardless of the technology available. Most students use AI to explain concepts or generate ideas, not to plagiarize wholesale.
Relying on generative AI to produce assignments bypasses the effortful cognitive processes—like reflection and structuring arguments—that are essential for forming long-term memories. As a result, students who use tools like ChatGPT have very poor recall of the essays they submitted, defeating the purpose of the learning exercise.
