As the government consults on how schools can “get the best” out of artificial intelligence, Amy Walker delves into why the revolution hasn’t taken off in classrooms … just yet
“We’re telling staff members how to use it to inspire them,” said Jonathan O’Donnell, the trust’s lead IT consultant. “It’s up to them to implement it within their curriculums and practices.
“We’re looking at ways in which we can support staff with their workloads so they can really concentrate on teaching and learning.”
A government report published in March showed nearly one in five teachers work at least 60 hours a week – and most spend less than half of that time teaching.
One of the ways Harris staff are using ChatGPT is to rewrite text for pupils to make it more accessible for different age groups.
Microsoft Live is also being used to translate classes for pupils with English as an additional language, including Ukrainian pupils.
Teachers speak into a microphone and pupils can pick up subtitles in their chosen language on their devices.
Tools shaving ‘hours’ off workloads…
The picture is similar at Academies Enterprise Trust (AET), another of England’s largest trusts, which says it has been exploring AI use “for some time”.
It will provide schools with access to generative AI options in tools that allow users to generate presentations from text prompts “in seconds”, and in Google Classroom, which uses AI to provide pupils with real-time feedback.
In March, Joel Kenyon, a science teacher at Dormers Wells High School in Liverpool, told the Commons science and technology committee that he uses ChatGPT to produce good and bad examples of answers.
He also creates specific tasks with AI while lesson planning. “If you wanted to generate five key stage 3 questions on atoms, you could do that really quickly,” Kenyon told Schools Week. The process has shaved “hours” off his workload.
Real Fast Reports, which describes itself as the leading AI school report writer, boasts around 5,000 teachers in England among its users.
About 50 schools have signed up for its school package trial this term, the company says.
…But most teachers still haven’t used AI
While some schools are ploughing ahead with AI advances, figures collected by Teacher Tapp on behalf of education publisher Oriel Square show that just 17 per cent of teachers had used AI tools to help with school work as of April.
Another 62 per cent had never used the technology.
Separate data collected from 500 secondary teachers by RM Technology suggests that for most of the sector, it is a hindrance rather than a help.
More than half (56 per cent) told RM that they felt education professionals needed proper training, with nearly a quarter believing it placed additional pressure on teachers because of its use by pupils. One private school said it was likely to scrap homework essays because of the potential for cheating.
But over a third of respondents felt the sector was not moving fast enough to adapt to it.
Lucas Moffitt, a former design and technology teacher, has recently set up “pedagogy infused” AI platform Teachology.ai. It aims to help teachers build lesson plans that can pull in educational videos and scientific journals from the web.
“If I could do that when I was teaching, I’d probably still be a teacher,” says Moffitt, who estimates the technology could save teachers “between four and eight hours” a week.
But the UK is currently its “slowest market”, with 1,300 current users. “If teachers aren’t able to find a way to harmonise with AI and include the benefits of what they’re doing, they’re absolutely going to fall behind.”
It’s not just about reducing workload
AI is also being used to enhance lessons.
Nino Trentinella, the head of art and photography at Sutton Grammar School in Surrey, won the Pearson National Teaching Award for digital innovator of the year this week.
It came in part for her integration of AI into the curriculum, including to help children learn art history by using platforms such as Scribble Diffusion and NightCafe Creator to create artwork in the style of famous artists, or to brainstorm ideas for physical artwork by generating AI paintings first.
“I think it has a lot of potential, and we don’t even know what most of that potential is. But I think by the time they have finished school, this will have evolved millions of times. They really need to be ready and already thinking about how to use it,” Trentinella said.
Harris’s O’Donnell has also used it to create songs and poems about “less captivating” topics.
“It’s inspiring pupils to explore topics…that might have been a bit dry before.”
What might the future hold?
John Roberts, Oak National Academy’s director of product and engineering, thinks the adoption of AI will grow with more specialist applications.
“We’ll definitely start to see that embedded for solutions and products that support workflow, for example.”
When Oak starts publishing its new teaching resources under an Open Government Licence this autumn, edtech and publishing companies using AI models will be able to adapt and use them for free.
The government’s flagship national teacher training initiative, the National Institute of Teaching (NIoT), is trying to work out how it could use the technology within professional development.
Callum Davey, its executive director of research and best practice, says it is considering using AI to listen to trainee teachers’ presentations to check whether their language and tone are “appropriate” for age groups.
What should we be worried about?
But Daisy Christodoulou, the director of education at assessment firm No More Marking, warns that conversations with heads and teachers make her “worried that there’s quite a lot of misconceptions” about how large language models work.
In particular, she says, many underestimate the extent to which the models are capable of making mistakes – what Christodoulou calls their frequent “hallucinations”.
A recent experience using ChatGPT to build No More Marking’s AI feedback site has taken the company from a “position of huge optimism to relative scepticism”.
The site was intended to provide marking and leave feedback for in-year assessment, but made “big mistakes” and was “inconsistent” in results.
“We’ve been really clear with users that we don’t think it’s fit for purpose in terms of assigning a grade,” she says.
“With anything it does produce, you have to spend a lot of time scrutinising it.”
Roberts warns that given potential inaccuracies in the content produced, there are “obviously risks associated” where it is used in a class taught by a non-specialist.
“We want to make sure that highly-trained, specialist teachers are able to make sure the output is accurate,” he says.
DfE looking for ‘safe’ way to use AI in education
However, Morgan Dee, director of AI and data science at EDUCATE Ventures, says that the potential inaccuracies and biases within AI-generated content are already present in other information sources.
“When you use the internet or you’re reading a newspaper article, you should be thinking about all these things as well.”
Concerns have also been raised, including by the Information Commissioner’s Office, about sharing pupils’ personal information with such technology, given unclear guarantees from tech companies about how that information is protected.
OpenAI, which developed ChatGPT, was contacted for comment, as were Google and Microsoft, which runs Bing Chat.
Gillian Keegan, the education secretary, said in March that teachers’ day-to-day work could be “transformed” by AI, but that the technology is not yet at the standard needed.
Launching a call for evidence last week, she said responses would help the Department for Education “make the right decisions to get the best out of generative AI in a safe and secure way”.
The DfE has previously advised schools they “may wish” to review homework policies and that sensitive data should not be entered into AI tools.