
DfE invites bids from AI tutoring pilot partners

The new technology is already used extensively in the classroom but, warns a union, is no 'magic bullet'

Esmé Kenney


The government has invited ed tech companies and AI labs to work with teachers to develop “safe, personalised” AI tutoring tools that will benefit disadvantaged pupils.

The AI tutoring trial was announced earlier this year, with the aim of creating fairness for pupils unable to afford private tutors.

Up to eight successful bids are expected for the scheme. They will have to show how their product will benefit disadvantaged pupils and how it will be accessible, inclusive and able to be used by pupils with different needs.

The tools will be “robustly tested” from this summer under teacher supervision, and co-designed with schools, with the aim to make the successful tools available nationally next year.

The Department for Education said up to 450,000 disadvantaged students a year in years 9 and 10 will benefit.

But there are concerns, with one union leader saying teachers are “far from convinced that AI tutors are a magic bullet”.

Schools Week spoke to leaders involved in the programme, as well as schools that already use AI, about the scheme.

More teachers using AI

The use of AI in classrooms is already on the rise, with a recent National Education Union survey showing that 76 per cent of teachers say they use AI tools for day-to-day work, up from 53 per cent last year.

The survey also found that 61 per cent use AI for resource creation, 41 per cent for lesson planning and 38 per cent for admin.

The new tools will be developed for years 9 and 10 across English, maths, science and modern foreign languages.

The DfE said the tools will adapt to a pupil’s needs, provide extra help when they need it and identify areas for further practice.

Katie Sharp, the director of education at the Great Schools Trust, said staff had largely welcomed AI “as a practical response to workload pressures”, allowing them to “focus more on high-value interactions with pupils”.

AI ‘an extra tool in the box’

Teachers within the trust have used AI to create deepfake avatars of themselves to deliver catch-up lessons for pupils who miss school.

“Rather than replacing professional judgment, the most effective use has been as an assistant, speeding up routine tasks while preserving teacher expertise.”

She added that staff “remain appropriately cautious about accuracy, safeguarding and over-reliance”.

Libby Hills, the founder of Ed Technical and a former headteacher, said AI would complement the work of human tutors, rather than being used as a standalone.

It would give teachers “an extra tool in the box to do some of the things that are harder for humans to do at scale on a regular basis, like instant feedback”.

Concerns raised about AI tutors

Despite the aims of the tutoring programme, the NEU survey found that only 14 per cent of its members in English state schools supported AI tutoring.

Forty-nine per cent said they disagreed with the policy, of which 25 per cent strongly disagreed, while 36 per cent had no opinion either way.

Responses from those opposing the scheme included concerns about AI undermining teacher relationships, that disadvantaged students needed in-person interaction, and that it was a cost-cutting measure to avoid giving schools more funding.

The survey also found two thirds of secondary teachers agreed that pupils’ critical thinking skills have declined because of AI, compared with 28 per cent of primary teachers.

Daniel Kebede, the NEU’s general secretary, said AI needed to be regulated so that schools “have appropriate tools that don’t undermine learning”.

“The profession is far from convinced that AI tutors are a magic bullet for closing opportunity gaps for disadvantaged students.

“AI will only improve learning and support teachers in their role if implemented correctly, within a vision of a highly skilled profession.”

Government benchmarks

The government has said it will develop new national benchmarks to check AI tools are accurate, age-appropriate and safe for pupils to use, and will work with teachers to create example classroom interactions and clear scoring criteria.

It is also opening access to its AI content store, which has a range of educational resources to support teachers’ use of the technology.

Hills said there needed to be some form of benchmark for the tools and that having access to a chatbot that was not tailored for educational use could have a negative impact on children’s learning outcomes.

Alex Russell, the chief executive of the Bourne Academy Trust, has been working directly with schools on the rollout of AI in classrooms.

He said he was encouraged by the government’s stance on being clear that ed tech products would be badged in some way for quality, and by the fact that it was working with the sector to design these products.

Genie ‘out of the bottle’

The genie was, however, “already out of the bottle”. Schools needed to adapt quickly to give pupils the experience that matched what they had access to at home and prepared them for the world of work.

Sharp acknowledged that while AI could help support pupils who might struggle to access traditional resources, careful guidance was needed.

“Without it, there is a risk of passive use or dependency rather than genuine understanding.”

She said that the government should work closely with schools to develop tools “rooted in classroom reality”, prioritising the development of AI that supported planning, assessment and feedback.

“By doing this at government level, it means that it can be driven by the education sector for the education sector rather than corporate organisations driving the agenda.

“Alongside this, government should provide clear guidance on data protection, quality assurance and ethical use, giving schools confidence in what is safe and effective.”

‘Not all AI tools equal’

With a huge market already pitching AI solutions to schools, many are unprepared to identify what could be valuable and what would be a waste of money.

Sharp said staff could become overwhelmed by “the sheer volume of AI tools available”, and that many of them “are not fit for purpose as [they] can be too generic or misaligned with curriculum intent”.

In response, her trust had trained its staff to use its own in-house AI tools, which were designed around its own curriculum, schemes of work and trusted sources.

Hills also stressed that “not all AI tutors are equal”.

“There’s a huge distinction between someone who has just put a brand label on a lightly adapted version of ChatGPT, versus a company that’s spent a huge amount of time really thinking about those learning design principles, pedagogy design, safety.”

Russell said that schools needed to be “discerning buyers” of AI, and that teachers needed to “think about the problem they’re trying to solve and then buy accordingly”.

‘No one-size-fits-all solutions’

Sharp added that the most effective AI tools in schools were not generic, off-the-shelf platforms, but those developed with teachers, curriculum and specific pupils in mind.

“Schools are finding that bespoke AI, trained on their schemes of work, aligned to their curriculum sequencing and grounded in trusted sources deliver far greater impact than widely available tools.

“Rather than rolling out one-size-fits-all solutions, the sector would benefit more from enabling schools and trusts to develop and share their own AI tools and models.”
