Ministers are scoping out how to ensure schools benefit financially from any future use of pupil data by artificial intelligence (AI) systems, Schools Week has learned.
The rapid rollout of generative AI such as ChatGPT and Google Bard has prompted a scramble across government to harness the technology’s power, but also to guard against any risks.
Such “large language models” could quickly process huge amounts of data, which experts say could help schools to understand their pupils better and analyse the impact of innovations.
Third-party organisations, including private companies, can already request data from the national pupil database for analysis.
The development of ever more sophisticated AI systems could make this analysis easier. However, ministers are understood to be concerned about pupil data being used to generate profits for private companies without any benefit flowing back to schools and pupils.
Baroness Barran, the minister leading on AI for the Department for Education, told Schools Week ministers were “absolutely thinking about all of these issues”.
“It wouldn’t be truthful to say that we’re clear on what principles we will follow, but obviously, we are extremely sensitive and aware of the use of individual or aggregated pupil data. That’s clearly a real priority that we get that right.”
DfE probes data ownership and value
Barran said ministers were asking “a number of questions”, including on ownership of the data and “what’s it worth”.
“It’s about as complicated as anything I’ve ever looked at. But we’re working with people who are experts in data ethics and privacy, to really think through these problems.”
Gillian Keegan, the education secretary, told London Tech Week on Wednesday that AI was “transforming the world”, and that education must not be “left behind”.
Niel McLean, the head of education at BCS, the Chartered Institute for IT, said there were potential benefits to using AI and pupil data.
“If you build up a really large data model, and you train it using the pupil level data, then you can use that data model to help you understand your students as whole people.
“Everything matters. Their attendance matters, their performance matters, all those sorts of things. You’ve got a better sense of them as individuals. AI can do that. It can just help you know your learners better.”
But he urged ministers to think about “four Ps”.
DfE needs ‘clear public benefit statement’
“There’s an ethics of purpose – what are you actually using this to do? There’s an ethics of processes – how is data handled? What’s the confidentiality? How secure is it? There’s a people side. You want the people doing it to be professional, and to feel they’re accountable.
“The fourth P that came to my mind is the payback. Having a clear public benefits statement about giving that data to this entity, what does it deliver? And it shouldn’t just be financial return. It should be something that improves things for young people.”
But the DfE already faces questions about its approach to data-sharing.
A damning audit by the Information Commissioner’s Office (ICO) in 2020 found the department broke data protection laws in how it handled pupil data. The full report still hasn’t been released.
The department was also reprimanded over a “serious breach” that allowed a company providing age-verification services for gambling firms access to the personal information of millions of young people.
Jen Persson, from the campaign group DefendDigitalMe, said the department should “publish the evidence of today’s data reality before getting ahead of itself with imagined futures. The 2020 DfE ICO audit must be published in full, with a timeline for what remains to be done.
“And the DfE must commit to giving families control over the current commercial re-uses of their own and their children’s information from the millions of named records in the national pupil database, that few know exists.”