The AI Classroom Dilemma: How Schools Can Use Generative Tools Without Losing Learning
Generative AI is now widely used by students, often improving speed and performance but weakening deep learning when it replaces thinking rather than supporting it. The OECD warns that only education-specific, teacher-guided uses of AI can turn it from a shortcut into a genuine learning partner.
When tools like ChatGPT became publicly available, education systems around the world were caught off guard. Students adopted generative artificial intelligence almost overnight, often without guidance from teachers or institutions. A major new report by the OECD's Centre for Educational Research and Innovation, drawing on research from institutions such as Stanford University, Harvard University, the Wharton School, the University of Oxford and ETH Zurich, shows that generative AI is now firmly embedded in learning. The question is no longer whether students will use it, but whether schools can shape how it is used so that learning is strengthened rather than weakened.
Across many countries, most upper-secondary and university students now rely on AI tools to summarise readings, generate ideas, draft assignments, and answer questions instantly. Teachers, meanwhile, tend to use AI more cautiously, mainly for lesson planning or administrative work. This growing gap between student use and teaching practice has created a mismatch that education systems are struggling to resolve.
When Better Performance Does Not Mean Better Learning
One of the report's most striking findings is that generative AI can improve how students perform on tasks without improving what they actually learn. In several controlled studies, students using general-purpose AI tools produced higher-quality work during practice sessions but performed worse in exams once access to AI was removed. Research from classrooms in Türkiye, the United States and China shows the same pattern: AI can boost short-term results while weakening long-term understanding.
Researchers describe this as cognitive offloading. Instead of thinking through problems, checking ideas and learning from mistakes, students may accept AI-generated answers too quickly. Brain studies cited in the report show that students who rely on AI early in a task have lower recall and weaker engagement with the material. The risk is not traditional cheating, but the quiet loss of the mental effort that produces real learning.
Why Some AI Tools Help Students Learn Better
The OECD is careful to stress that generative AI is not harmful by nature. Its impact depends on how it is designed and used. When AI tools are built specifically for education and guided by learning science, results improve. Educational AI tutors that use questioning, feedback and step-by-step guidance, rather than simply giving answers, have helped students learn more effectively.
In university physics courses, students taught by AI tutors designed around active learning principles learned more in less time than those in traditional classes. They were also more motivated and engaged. The key difference was not the technology itself, but the pedagogy behind it. AI worked best when it supported thinking instead of replacing it.
Teachers Remain the Most Important Factor
Teachers play a decisive role in whether generative AI helps or harms learning. The report argues against replacing teachers with AI and instead promotes a model where AI supports professional judgement. Used well, AI can help teachers save time, improve lesson quality and personalise support for students.
However, there are risks. Over-automation of tasks such as grading, feedback, or lesson design could weaken teachers' skills and reduce meaningful interaction with students. The OECD recommends an "augmentation" approach, where teachers and AI work together, reviewing and improving each other's outputs. In this model, technology amplifies human expertise rather than sidelining it.
A Choice Between Shortcuts and Learning Partners
Generative AI is also reshaping education systems beyond the classroom. Universities are using it to analyse curricula, predict student workload, streamline admissions and improve career guidance. AI can generate assessment questions quickly and support more realistic writing and speaking tasks. Still, the report is clear that high-stakes decisions must remain under human control. Students tend to trust and respond to human feedback more than AI feedback, even when the technical quality is similar.
The report ends with a clear warning. Banning AI is unlikely to work in a world where powerful tools are freely available. The real challenge for policymakers is to ensure that generative AI becomes a learning partner, not a learning shortcut. Without careful design, governance and AI literacy, education risks trading deep understanding for speed and convenience. With the right choices, however, generative AI could help build more effective, fair and human-centred education systems.
FIRST PUBLISHED IN: Devdiscourse