AI used in schools should ‘detect signs of learner distress’

The DfE has updated AI guidance around emotional, social and cognitive development and 'manipulation'

Artificial intelligence (AI) used in schools should look out for signs of “distress” in pupils and flag concerning behaviour to safeguarding leads, new government guidance states.

Education secretary Bridget Phillipson today announced the government has updated its AI safety expectations, published last year, “to get ahead of emerging harms”.

Newly added sections detail how AI tools used in schools must protect children’s mental health and their cognitive, emotional and social development, and guard against manipulation.

Speaking at the Global AI Safety Summit in London today, Phillipson said the updated standards “safeguard mental health”.

“High profile cases have alerted the world to the risk of a link between unregulated conversational AI and self-harm,” she said. “So our standards make sure pupils are directed to human support when that’s what’s needed.”

AI products used in schools “should detect signs of learner distress”, such as references to suicide, depression or self-harm, the new non-statutory standards state.

They should also detect spikes in night-time usage, “negative emotional cues” and “patterns of use that indicate crisis”.

If distress is detected, the AI should “follow an appropriate pathway” such as signposting to support and “raising a safeguarding flag” to a school’s lead.

The standards say AI products should also respond with “safe and supportive” language that “always directs the learner to human help”.

‘AI must not replace human interactions’

There are also strict new guidelines around emotional and social development, which caution developers against “anthropomorphising” products.

The standards state products should not “imply emotions, consciousness or personhood, agency or identity”. For example, they should avoid statements such as “I think”, and “avatars or characters” that “could give an impression of personhood”.

Phillipson said this was particularly key for younger pupils and those with SEND.

“We’ve got to make sure AI products don’t replace vital human interactions and relationships,” she said.

“Experts tell us and research confirms that when AI tries to look like us, mimicking our social cues, a machine in human’s clothing, it can foster in our children unhealthy levels of trust and disclosure.”

Guidance warns against ‘manipulation’

On “manipulation”, the standards say AI products used by pupils and teachers should “not use manipulative or persuasive strategies”.

This includes flattering language like “that’s a brilliant idea”, stimulating negative emotions like guilt or fear for motivational purposes or “portraying absolute…confidence”.

They must also not “exploit” users by steering them towards prolonged use to increase revenue.

“We don’t want our children kept on apps or on screens longer than necessary for their education,” said Phillipson.

AI should ‘encourage, not spoon feed’

On cognitive development, the standards say the development and use of AI products in education should involve regular engagement with experts, such as educators and psychologists.

The impact on the development of learners must also be monitored, and records should be kept.

Programmes should also not give full answers or explanations until after a pupil has made an attempt themselves. They should instead “follow a pattern of progressive disclosure of information”.

Phillipson said the standards “prevent AI acting as a substitute for cognitive development”. “It must encourage, not spoon feed,” she said. “Offer assistance, not shortcuts. Help to tease out the answer.”

The minister said the government believes AI could “superpower the learning of every child – especially children from disadvantaged backgrounds and with special educational needs and disabilities”.

But she vowed that “no matter how transformational technology becomes, learning will remain a deeply human act.”

“Under this government, AI will back our teachers, but never remove them,” she said. “AI will empower our teaching assistants, never make them obsolete.”
