What medicine can teach us about regulating AI in education

The critical issue is accountability. If an AI tool gives inaccurate results, who is responsible?

2 Sep 2023, 5:00

Teachers will be very familiar with the power of analogy. There’s nothing quite like it for making the penny drop, turning once-impenetrable ideas or difficult new concepts suddenly crystal clear.

The reason analogy works so well is that the best way to understand new ideas and concepts is to relate them to something we already know and trust.

In our recent response to the DfE’s AI consultation, I likened AI tools to actors in the BBC drama Casualty: they look and sound like doctors, but cannot actually perform surgery in a hospital to make people better.

Like the actors, AI tools might use convincing medical language, but they couldn’t – and shouldn’t – ever insert the scalpel.

Taking this analogy a step further, we can think of AI regulation much like the way medicines are controlled.

Any medicine new to market has to go through rigorous testing and approval procedures. Even when a new cancer ‘wonder drug’ appears, it will still undergo years of trials and due diligence, because a bad medicine is likely to be worse than no medicine at all.

And even when it makes it to the pharmacist’s shelves, it comes with a leaflet of contraindications – listing its risks and shortcomings to help you decide whether and how to use it.

The objective of these regulations is clear: to improve patient outcomes, in particular longevity and quality of life.

Similarly, we think there are three key things we need to achieve through regulation if AI is to successfully improve outcomes in the education sector. AI must:

  1. Add clear value to teaching, learning and assessment, avoiding common pitfalls such as bias and harmful or inaccurate information
  2. Offer equality of access – a level playing field – so disadvantaged schools do not fall further behind
  3. Not inadvertently put schools at risk, for example through intellectual property issues

The critical issue that regulation must address is accountability

Of course, some regulation is already in place, and the government’s white paper has sought to grasp the issue. It recognises that public trust is absolutely key, and that trust will be undermined unless the risks and wider concerns about the potential for bias and discrimination are addressed. It sets out five principles to guide and inform the responsible development and use of AI in all sectors of the economy.

One of these principles is fairness, which is clearly critical in the education sector. But to really build that all-important trust, we need more specifics.

Returning to medicine, new products are first approved for safety by the MHRA regulator. Then, if a product is to be approved for use in the NHS, it also has to be approved by NICE, which looks at the product’s cost-effectiveness.

We could envisage a general AI regulator performing the equivalent of the MHRA’s safety check, with the DfE then being responsible for a check on the educational effectiveness of the products, somewhat analogous to the role of NICE for the NHS.

This could even include kite-marking products shown by research to have educational value, working with the Education Endowment Foundation, awarding organisations such as AQA, and groups of schools such as those who recently wrote to The Times, led by Antony Seldon.

This might also helpfully include some hands-on, meaningful guidance and training for schools. In particular, we believe schools in more disadvantaged communities must have the same opportunity to benefit from AI as every other school.

Ultimately, the critical issue that regulation must address is accountability. Who is responsible for the behaviour of AI? If a teacher uses an automated marking system and it gives inaccurate outcomes, who is responsible: the teacher, the school or the software developer?

By treating AI like we do medicine and adopting a similar regulation framework, we can build much-needed trust, confidence and fairness, and protect students, schools and teachers by providing clear guidelines and accountability.
