Robotics and AI
When human outshines machine
In a world increasingly shaped by artificial intelligence, we are often dazzled by what machines can do. They can generate lesson plans in seconds, answer complex questions with fluency, and tailor learning pathways with breathtaking precision. But there are moments in every school day—quiet, unmeasurable, irreplaceable—when it becomes clear that education is still, at its core, a deeply human enterprise.

I think back to a student who lingered after class recently, notebook in hand, ostensibly with a question about an assignment. AI could have offered a dozen model answers, each better than the one I gave in that moment. But that wasn’t the point. What she needed was not a perfect answer but a person—someone who would listen, reassure, and remind her that she wasn’t falling behind, that she was seen. No dashboard notification or predictive model could have picked that up.
These small, easily overlooked moments are reminders that while AI may enhance how we teach, it cannot replace why we teach. The real magic of the classroom lies not in the speed of feedback or the scope of data but in the relational space between student and teacher, where trust is built and confidence quietly restored.
The Barker Journey interviews reinforced this. Students were clear-eyed about AI: they saw its usefulness and were eager to explore it. But what they consistently returned to was the value of guidance, of conversation, of being helped to make good choices—not just efficient ones. They wanted to use AI, yes, but they didn’t want to be left alone with it. The presence of a teacher mattered—not to restrict their use of technology, but to humanise it.
This lesson has been especially clear in our work with refugee-background students. AI has been an extraordinary tool—able to provide multilingual resources in seconds, translate texts, and support translanguaging in ways that were previously time-consuming or impossible. It’s helped me create classrooms where students can access content in their own languages and build bridges to English learning with more confidence. But learning—real, lasting learning—comes from interaction. It comes from the quiet work of understanding someone’s backstory, listening to what is said and what is held back, and building a classroom that feels safe before it feels smart.
No algorithm has ever told me why a student doesn’t make eye contact when we talk about war, or why another child pauses every time we mention borders. That understanding comes from sitting beside them, not from scanning data.
So yes, I’ll keep using AI. I’ll keep exploring its capacity to support, scaffold, and enhance learning. But I will do so with a clear sense of where its limits lie, and a clearer sense of what only teachers, only humans, can do. This question is worth keeping in focus: not just how AI works in the classroom, but how it reshapes relationships, belonging, and the emotional texture of learning. It reminds us that the future of education will not be built by algorithms alone. It will be shaped in the space between teacher and student, where empathy, trust, and understanding continue to matter most.

Dr Timothy Scott
Tim has held leadership roles in schools across Australia and abroad for 25 years, alongside teaching History and Modern Languages. His research focuses on intercultural learning and pedagogical translanguaging, refugee education, and student voice in improving educational practice. He is a lead researcher for the Barker Institute’s ongoing decade-long longitudinal study, The Barker Journey. Alongside his research work, Tim currently teaches History and Global Studies. His PhD examined socio-political influences on contemporary German conceptions of history and archaeology.