Robotics and AI
Continuing the balancing act: AI and education in ongoing conversation
The importance of teachers and students engaging critically with AI is already well understood in schools. It is the starting point for any conversation about AI in education, even if only tacitly acknowledged. Our recent PD conference began from this shared understanding and then pressed us to go further: what does it look like to equip students to use AI as a tool for learning, while preserving independent thought and academic integrity?

While the conference was focused on professional learning for staff, these same questions are also surfacing in our research with students. The 2025 Barker Journey study, conducted separately from the conference, found that students want to use AI in their learning but are concerned about how to do so responsibly. They recognise the value of AI as a tool, but not as a replacement, and they stress the importance of maintaining creativity, independent thinking, and academic integrity. The alignment between what teachers are exploring and what students are saying is both striking and encouraging.
Of the sessions I attended, the one I found most useful was on Cogniti, a platform that allows teachers to develop AI agents tailored to student learning. This challenged me to design an AI “coach” to guide students through their IGCSE Global Perspectives Team Research Project. The idea is not that AI gives students the answers, but that it poses questions, prompts reflection, and scaffolds their thinking—from developing a research question to writing their reflective paper.
My first trial, where I posed as a student, revealed an important lesson: the AI’s feedback was too much, too fast. For this to be useful, it needs to “chunk” feedback into smaller, more manageable steps that students can process and respond to in their own time. I have a little more work to do on it! This is where AI shows both its promise and its challenge: the technology is powerful, but how we shape its interaction with learners will determine whether it deepens or diminishes critical thinking.
Other sessions broadened the picture. We explored AI as a research tool, experimented with the craft of prompt writing, and even trialled AI’s capacity to mark assessments. In this last workshop, assessments already double-marked by the teaching team were run through Microsoft Copilot. The results were striking. Copilot’s marks closely aligned with the teachers’, and the feedback it produced was both insightful and constructive. No one suggested that AI could or should replace the teacher’s judgment, or that marking policy is about to change; rather, the experiment highlighted AI’s potential to complement marking by providing richer, faster feedback.
Perhaps the greatest value of the conference, however, came from the conversations between the formal sessions. Over coffee and in corridor chats, colleagues wrestled with AI’s ethical and educational implications. One colleague expressed deep concern about whether AI tools draw on material that may be pirated or misused – a reminder that questions of intellectual property remain unresolved. Others reflected on AI’s potential to sharpen rather than blunt students’ critical thinking, provided we frame it carefully and encourage thoughtful use.
The biggest takeaway for me was not whether AI will shape education; it already is. Nor was it simply the reminder that educators must shape its use wisely; that much is clear. What struck me most was how often our conversations stay focused on immediate questions: Will AI change assessment? Will it speed up feedback? Will it help or harm? These questions matter, but they are too narrow. AI’s rapid growth is reshaping the world more profoundly, and more quickly, than many of us realise, and we need to look beyond these immediate concerns.
The comments of the Barker Journey students show why this wider view matters. Students are asking how to use AI responsibly and with integrity. Their questions differ from ours: while we often focus on tools and systems, they are already thinking about the world they want to live in. They see AI as part of that world, and they want the skills to shape its use for the common good. For educators, then, the task is clear. We are responsible for preparing the next generation to live and lead in a world where AI will be ever-present. Our role is not to prescribe the world they must build, but to help create the conditions for the world they wish to inherit.

Dr Timothy Scott
Tim has held leadership roles in schools across Australia and abroad for 25 years, alongside teaching History and Modern Languages. His research focuses on intercultural learning and pedagogical translanguaging, refugee education, and student voice in improving educational practice. He is a lead researcher for the Barker Institute’s ongoing decade-long longitudinal study, The Barker Journey. Alongside his research work, Tim currently teaches History and Global Studies. His PhD examined socio-political influences on contemporary German conceptions of history and archaeology.