Government must do more to regulate AI in education

Lack of a clear government strategy to ensure safe exploration of AI in education puts children at significant risk of harm, writes Victoria Nash

6 May 2023, 5:00

We’ve come a long way since electronic calculators revolutionised individuals’ ability to complete challenging maths problems without pen or paper. With the advent of ChatGPT, DALL-E and a host of ‘edtech’ apps and programmes, important questions are once again being asked about the appropriate role of technology in classrooms.

The potential benefits of these new AI-driven tools are extensive: greater personalisation for learners, measures to reduce teacher workload, and opportunities to prepare learners for the jobs of the future. But the stakes are incredibly high, and an unreflective approach to the adoption of educational AI could result in significant harms to children and destroy trust in innovation.

So far, the government has indicated a largely hands-off approach to regulating AI, framing this as fundamentally ‘pro-innovation’ and preferring to leave the matter to existing sectoral regulators. In the education context this seems risky, not least because it’s unclear which regulator could take responsibility. This leads us to ask: shouldn’t government regulate now to ensure safe and positive use of AI in schools?

The EU has something of a head start on this, and a UK strategy could usefully build on measures developed there. The draft European AI Act adopts a risk-based approach, imposing the strictest obligations where the risks of AI are greatest. Specific mention is given to AI in educational contexts where use of such tools could alter someone’s life course, such as in exams or assessments. Companies producing these higher-risk AI products will be obliged to abide by rules such as mandatory risk assessments, use of high-quality training datasets and provision of additional documentation to enable oversight of compliance.

In addition to this draft law, the EU has also produced detailed guidelines setting out key considerations to support positive use of AI in schools. These include both ethical principles and recommendations necessary to ensure that the technologies are trustworthy. This seems like a good starting point for any UK strategy.

In terms of ethics, the guidelines suggest that school leaders and educators should consider whether use of the proposed AI tools leaves room for human decision-making, whether they treat everyone fairly, whether they respect human dignity and whether they provide transparent justifications for decisions made. To ensure the tools are trustworthy, the guidance highlights the importance of accreditation and the need for companies producing educational AI tools to observe similar ethical principles.

Accreditation or certification seems like an obvious move. Indeed, its absence is rather shocking. Playground equipment must meet appropriate safety standards and should be installed by expert contractors. Reading schemes are designed for specific learning outcomes and based on well-researched pedagogy. In contrast, anyone can currently design an AI-based educational app or service without any specialist educational input, and that tool can be rolled out to a whole class or school – and many are. Greater oversight is vital if we are to ensure, first, that such technologies are safe, secure and reliable and, just as importantly, that they deliver real benefits to learning.

The final aspect of a comprehensive regulatory strategy should be support for more effective data governance. As the Digital Futures Commission noted, this is vital if children are not to be exploited commercially whilst gaining an education. Many digital services and apps harvest huge amounts of data from their users, often in lieu of payment, whilst the terms of service explaining this are painfully obscure. Navigating such data protection responsibilities is complex, and schools are poorly resourced to manage this, both in terms of expertise and infrastructure. Investment and training in data protection are needed, as well as provision of more government support and advice.

Together, these three components – ethical guidelines for design and use, certification or accreditation, and comprehensive data governance – would provide a decent starting point for a UK strategy that would enable safe, positive and effective exploration of the great potential of AI technologies in schools. It’s vital that this conversation starts now.

Professor Nash spoke on the subject of AI and education at the Oxford Forum event, presented by Oxford University Press, on 24th April.
