Don’t let AI change what it means to teach
Technology is changing the work of teachers, but education must remain human-led.
Artificial intelligence (AI) is reshaping classrooms worldwide. It drafts lesson plans, suggests learning activities and tracks student progress, among many other capabilities. Once an optional assistant, AI is now positioned by tech companies as an active agent in teaching and learning.
In Singapore, this shift is especially pronounced. The latest Teaching and Learning International Survey (TALIS) found that three in four teachers here adopt AI to teach or facilitate student learning, more than double the global average of 36 per cent.
This doesn’t include the non-AI digital resources that teachers regularly employ, ranging from gamification to cloud-based classrooms and recorded content. Across the profession, teachers’ proficiency in deploying technology in the classroom has increased dramatically since the pandemic.
Yet behind the rapid adoption of digital solutions lie uncomfortable questions. Are teachers being forced to become less like educators and more like tech operators? What, then, does it mean to be a teacher and to teach?
AI and its paradoxes
At the beginning of my career, I was a research scientist – a protein crystallographer deriving protein structures. This was a slow, meticulous process that took years of experimental work by an international team to complete. Then in 2018, Google’s DeepMind released AlphaFold, an AI system that can predict complex protein structures in just a few weeks.
Overnight, the experimental work I once did became obsolete.
Astronomers I’ve worked with tell similar stories. Many of these scientists had built their careers around time spent at the telescope. With automation, they now analyse data collected by machines they don’t touch, and this changes how they view their profession as astronomers.
The concern is that teachers now face a similar transformation in how they perceive their work. Technology doesn’t just change what teachers do; it also affects how teachers view themselves.
Our professional identity and sense of competence are often tied to our professional values and to what we do and contribute at work. When AI or automation challenges that basis, it can feel deeply personal. New systems can, at first glance, seem to devalue expertise that a person has spent years developing.
AI can take over repetitive tasks, freeing teachers’ time for more meaningful work. It can also provide students with instant feedback.
But if AI is now offering “personalised feedback”, what happens to the teacher’s role in guiding, encouraging and stretching students through difficult learning? More crucially, will educators begin to feel detached from the affective and relational essence of teaching?
Reading the room
In 2020, I led a study of how university teachers transformed their teaching in the first weeks of lockdown. One lesson was immediate: teaching is not just the transmission of content. More importantly, it involves human interaction, interpretation, care and response, guided by human signals.
Body language and facial expressions are vital visual cues that help teachers read the room, adjust explanations, provide encouragement and know when a student is lost or ready to move on. But when students switched their cameras off, some teachers felt stripped of that sensory feedback.
As one teacher said: “I cannot see the (students) to gain clues as to engagement and understanding.” The problem was not merely technical. It was cultural and emotional. Teachers were being asked to teach without the human cues that make teaching possible. And yet what stood out most was what teachers did next.
Despite the isolation from their students, they worked relentlessly to rebuild connection: designing interactive tasks, using digital tools to prompt dialogue, checking in on students who were disappearing behind black screens. This was not an algorithm at work, but care.
AI does not possess social-emotional intelligence. It cannot notice discouragement in silence, respond to a tremor in a voice, or decide that a struggling student needs encouragement more than correction. If we treat feedback and teaching as purely informational, we erase the very thing that makes learning human.
Digitalisation doesn’t always reduce workload; sometimes it just reshuffles it. Every new tool creates “metawork” – unseen labour that sustains the system. As students use AI, teachers now spend time verifying authenticity, redesigning tasks and defending academic integrity. This work is necessary, but it is invisible and unacknowledged. And when invisible work piles up, burnout may follow.
AI is not just changing classroom routines. It is also quietly shifting power and professional meaning. If schools adopt AI platforms uncritically, the most influential people in education will no longer be educators, but technologists and vendors, often far removed from classroom realities and pedagogical ethics. That is not progress. That is a transfer of authority away from those who understand learning best.
Teachers must remain the ethical and pedagogical leaders of this transformation because at the heart of teaching is meaningful and irreplaceable human work.
What it means to teach
If we want to know where AI belongs in schools, we have to be honest about what teaching is. Teaching isn’t a bundle of tasks. It’s a demanding set of cognitive, emotional and social practices that machines can assist with but not replicate.
First, teaching is real-time sensemaking. Teachers constantly interpret subtle cues such as shifts in attention, hesitation, confusion or sudden insight. They adjust explanations, reshape activities and change their pace based on what they observe.
For example, when I’m teaching, I watch how the students respond and adjust my interactions with them accordingly. I’m not simply giving them “personalised feedback” based on what they understand. Using emotional judgment, I also interpret how they feel – confident, nervous, anxious – and try to interact with them in an appropriate way.
AI can analyse work after the fact, but it cannot understand the meaning of learning as it unfolds, moment by moment.
Second, teaching is deliberate design. Teachers sequence concepts and ideas, anticipate misconceptions, frame productive questions and construct learning pathways that help students develop their understanding. They know when to challenge students, when to hold back and when to reframe ideas. These decisions rely on human insight, not automated pattern-matching.
In my experience, challenging a student who already lacks confidence can set back their progress, even if they understand the concepts. They need to be nurtured until they are ready for a challenge. AI can’t gauge this inflection point.
Third, teaching shapes the emotional climate in which learning happens. Students thrive when they feel safe, seen and motivated. Trust, fairness and encouragement aren’t add-ons; they are the medium of learning. Creating trust and motivation requires professional judgment, not pre-programmed responses.
These elements sit at the heart of human identity. They explain why teaching cannot be reduced to the management of digital tools, data analysis or the supervision of automated feedback.
How technology should support teachers
The Singapore education system has long insisted that the purpose of schooling is not just competence, but also character; not just achievement, but also human flourishing. That gives us a compass as AI enters classrooms.
The task now is to hold that line purposefully. Teachers should welcome tools that lighten the load and sharpen insight. They should refuse any future where teaching is reduced to managing platforms or policing outputs. AI must be designed with teachers, not imposed on them; governed by educational values, not market logic; and deployed to extend human care, not replace it.
One example is the Navigo game, a literacy tool featuring over 900 individual game activities designed to cover a wide range of the school reading curriculum. In a project led by my colleague Professor Mina Vasalou of the UCL Knowledge Lab at University College London, the resources were co-designed in collaboration with teachers and over 5,000 children across six European countries. This award-winning project demonstrates that teachers, students and their parents are important stakeholders who must be co-creators if the technology is to address their needs.
Schools and the education sector mustn’t take a “data-driven” or “technology-first” approach to the future of teaching. Deciding what it means to teach in the age of AI should begin with finding ways to put humans first.
So let’s be clear about the direction of travel: technology must adapt to teachers, not teachers to technology. AI should serve professional judgment, not supplant it. If we hold firm to that, Singapore can show that the future of education can be both AI-supported and deeply human.
As we advance, let us hold firmly to the values that make us human, because they will guide how we use technology and ensure it serves people, not the other way around.
Allison Littlejohn holds the National Institute of Education’s Dr Ruth Wong Professorship. She is also pro vice-provost for the Data Empowered Societies Grand Challenge and Professor of Learning Technology at University College London.
School for Humans is a new Opinion series in January that aims to deepen the conversations around education and highlight the human forces at the heart of teaching and learning.
Source: The Straits Times © SPH Media Limited. Permission required for reproduction.


