Artificial intelligence (AI) is changing schools in a way that goes far beyond software procurement or a new digital policy. The schools making meaningful progress are not treating AI as a gadget or a bolt-on initiative. They are approaching it as a leadership question, a teaching question, and ultimately an equity question. “You need to be making teachers comfortable and confident with using AI so that it’s a tool for learning and not a substitute,” says Brian Cooklin, the former Executive Director of Nord Anglia Education. Smart technology redefines learning only when schools rethink the systems around it, from teacher development and assessment to parent communication and student thinking.
Many schools start with the tool rather than the purpose – a common mistake in Cooklin’s view. “It comes back to this question of seeing the AI development as a strategic purpose and therefore to have a strategic intention, so you have to design it from the root up,” he says. That means policies, processes, user guidelines, codes of conduct, and a development plan backed by a real budget.
This is where his international leadership lens matters. Having led schools and regions across multiple education systems, Cooklin frames AI adoption as an operational challenge as much as a pedagogical one. A school can buy a platform, but that does not mean teachers will know how to use it well. Nor does vendor training solve the deeper issue. Technology companies are often effective at showing what their product can do, but “how do I now use it in teaching?” remains the bigger challenge.
The schools moving fastest are building AI into continuous professional development rather than relying on one-off workshops. Sustainable change comes from embedding expertise inside the school through digital leads, peer support, and professional learning communities. “There’s no point having a one-off course,” he says. “We all know that doesn’t work, it doesn’t stick.” In practice, that means pairing confident staff with hesitant colleagues and creating structures where teachers share what works in real classrooms.
The Real Opportunity Is Personalization at Scale
For all the attention on efficiency, Cooklin sees AI’s greatest promise elsewhere. “There is no doubt that this technology has the power to close the gap,” he says, particularly through individualized learning. In that sense, AI is not merely a productivity tool for schools. It is a way to deliver the differentiation educators have long talked about but often struggled to provide consistently.
Cooklin returns repeatedly to students with additional needs. For children with dyslexia, autism, or other physical or learning challenges, adaptive tools can remove barriers and create more precise pathways through the curriculum. He describes this as “the holy grail” of teaching: helping each learner work at the right level and move to the next one with confidence.
There is also a practical leadership dividend. AI can reduce some of the administrative load that drains teacher time, from report writing to lesson planning and routine paperwork. The strategic value is not that machines replace educators, but that they free educators to focus on human intervention. The teacher’s time can then be spent where it matters most, supporting learners, asking better questions, and responding to individual need.
Trust Depends on How Schools Bring Parents Along
And when it comes to increasing engagement and support? “Communicate, communicate, communicate. There’s no substitute for it,” he says. For school leaders, that work begins before implementation, not after concerns arise. Parents, staff, and students need to be part of the discussion from the start, especially when the change touches something as sensitive as how children learn.
His approach is rooted in transparency. Parents are often anxious not only about AI, but about screen time, smartphones, and the social effects of digital life. Schools therefore need to explain what the technology is for, where its limits are, and how it will improve learning in concrete terms. Showing how a student with dyslexia has progressed through an AI-supported intervention carries more weight than any slide deck.
He also makes a subtle but important point about credibility. Parent workshops are valuable, but they are most persuasive when students help lead them. “If the student is able to explain how this is helping them, it carries an awful lot more weight,” he says. That is less a communications tactic than a leadership principle. Change lands better when it is visible in the learner’s experience.
Assessment Will Shift Toward Process, Judgment, and Human Thinking
If AI can generate passable homework in seconds, traditional assessment becomes harder to defend. Cooklin expects schools to move back toward supervised, in-class assessment and live demonstrations of learning. “Show me how you did this in class now,” he says, describing the kind of evidence teachers may increasingly seek. But the more significant shift is philosophical. Rather than grading only the finished product, schools may need to assess reflection, judgment, and the quality of a student’s thinking about AI output. “You are testing their critical thinking, their judgment, to some extent, their metacognition,” Cooklin says. That is especially relevant as agentic AI becomes more capable.
For Cooklin, this is the why behind the entire debate. Schools that respond well will protect what is distinctly human in education. “Critical thinking and creativity are things that keep you human and separate from agentic AI,” he says. The future of learning, then, will be a test of whether school systems can use smart tools to elevate deeper thinking rather than automate it away.