Exams

To be fair, we can do better than exams

13 Aug 2021, 9:30

Ministers’ clamour to return to our pre-pandemic exams system is founded on an impoverished idea of fairness, writes Jo-Anne Baird

Schools minister Nick Gibb said in a BBC interview yesterday that comparing this year’s GCSE results with pre-pandemic outcomes was like comparing apples and oranges. He also believes that exams are the fairest and best way to assess. But what evidence is there for that claim?

The logic that exams are fairest comes from the notion that everyone sits the same question papers under the same conditions. Everyone has a fair shot to achieve, with results based on merit.

Except, this is only one definition of fairness, and the meritocratic worldview it’s based on is clearly flawed. When people can pay for a better education, or have more privileged educational experiences because they are wealthier, exam results reflect that.

The pandemic has thrown into stark relief the different educational experiences young people can expect. Nearly one third of private schools provided four or more lessons a day during the first lockdown, but only 2 per cent of state schools were able to do the same. Sitting the same exam would not come close to sorting out this level of unfairness. These experiences only exacerbate existing inequalities.

A different definition of fairness would entail openness to diversity and allowing for different ways of achieving the required standards. It might even allow people to work hard, try again and re-sit assessments, perhaps in a modular fashion for some subjects. It might also mean more variety in the qualifications diet, such as more vocational qualifications that equip young people with the skills to go on to employment in sorely needed sectors such as plumbing, IT or healthcare.

Exams are good for selection. They can be designed to discriminate between pupils, to give them a rank-order. But there are big question marks over whether this is a sensible primary function of our school qualification system.

First awarded in 1988, GCSEs were the school-leaving exams. But that is no longer the case. Young people now stay in education and training until age 18, so allocation to courses at the next level could be handled differently.

Equally, many universities have entrance exams in specific subjects. A-levels – the new de facto school-leaving exams – have not fulfilled the selective function for quite some time.

It’s time for a rethink. This is why I have joined the National Education Union’s Independent Assessment Commission, which reports in the autumn on principles for a new assessment system. We are seeking a new ERA – an equitable and reliable assessment system.

During the pandemic, it was teachers’ professionalism that saw the qualifications system right. Teachers put in the extra hours to make sure the results were as fair as possible and based upon evidence. We should be investing in that professionalism for the future. This year’s teacher-assessed grades (TAGs) certainly proved onerous, but with the right design and a proper consideration of workload, new forms of assessment could be made manageable.

Sadly, the current Initial Teacher Training Market Review proposals do nothing to promote teacher professionalism. If we want a high-stakes assessment system that is fit for purpose, it will have to rely on having the best teachers to make it work, and that means they will need to understand assessment. So rather than an isolated review of teacher training, what we need is a forward-looking review of our qualification system as a whole, which may well include exams.

There has been a welcome stability in our qualifications system for a few years, but it is time to re-evaluate and build a better one – one in which technology plays a larger role, at the very least.

If our qualification system is to embrace the diverse needs of modern Britain and embody a wider form of fairness than Nick Gibb’s proposition, we need to recognise that exams are not the only fruit.

Your thoughts

3 Comments

  1. Janet Downs

    Exactly. I’ve repeated this slogan so often it’s likely it will meet rolling eyes. But I’ll say it again: move towards graduation at 18 via multiple routes.

  2. Mervyn Benford

    Fairness must start with how grades are awarded, so that they statistically and professionally reflect age differences even within a single year group. The summer-born failure syndrome has been around since I took the 11-plus in 1948. The DfE has acknowledged that every strategy so far has failed, and it still obtains. The IFS, of all organisations, decided to study it (relating to employers and work, no doubt) and concluded that standardising raw scores by month of birth was the solution. This was always the commercial statistical and professional strategy: every maths, English and IQ test I used over 25 years in teaching had tables to do just that. Exam grades so adjusted would make summer-born failure disappear.

    As for fairness, what is fair about exam seasons falling in the middle of the hay fever months? I saw how difficult it was for my daughter to revise, sleep and attend school while wheezing and sniffing, with a puffer everywhere she went, and then she had to sit for hours alone at a desk and somehow answer complex questions. Hay fever may not be the only problem on exam days; other everyday conditions can occur that distract attention.

    Finally, as a head I learned of three strategies used at university to make exams more telling: peel-able labels that gave a relevant hint but reduced the overall mark, allowing students to show they actually knew some of what was required rather than be stuck and lose all the marks for that question; and degree exams with standard papers but three days rather than three hours. Let them discuss it! Good marking will still sort out those with the better overall grasp.

    I fear the system is stuck in its antiquity and shame. A former head of government moderation said, around 2007/8, “Our exam system is diseased and almost corrupt.” The DfE said the IFS month-of-birth standardisation would solve the problem, but that employers prefer traditional exams! Over the decades employers have actually complained more about schools not producing enough people who can think for themselves, make decisions, take responsibility, work in teams and solve problems. Exams simply ignore such intellectual talents. That DfE report nevertheless added that, on average, 10,000 GCSE grades each year were wrong!

    • Mr Benford: “on average 10,000 GCSE grades each year were wrong” is a substantial underestimate. Over recent years, when there were exams, about 5 million GCSE grades were awarded annually, of which not 10,000 but about 1.25 million were wrong. 1.25 MILLION.

      I’m not making that up. Ofqual’s own research shows that exam grades are about 75% reliable, and on 2 September 2020 the Chief Regulator admitted to the Select Committee that exam grades are “reliable to one grade either way”. For a reconciliation of that statement with the 75% reliability figure (and hence 25% unreliability), see https://rethinkingassessment.com/rethinking-blogs/just-how-reliable-are-exam-grades/