Examinations are a defining element of education policy in England. Their results are the basis of selection for prestigious university places and for employment, and are used to evaluate school performance and teacher effectiveness.
When the pandemic struck, governments around the globe faced the same dilemma: how to assess without fuelling the public health crisis. Only in the British Isles was the policy choice made to standardise teacher judgements statistically, through what became known as ‘the algorithm’. In China, for example, the examinations were delayed but went ahead. As one of our Master’s students from China predicted at the time: ‘One thing is for sure, the Gaokao exams will still take place.’
Policy decisions made in the crisis of the pandemic were shaped by the competing frames constructed by political leaders. The u-turn in England in 2020, which withdrew the standardised grades and replaced them with teacher estimates, was made in response to public reaction to inequalities in the process.
These inequalities were especially visible in schools with small cohorts, whose scores the algorithm did not adjust. As a result, private school pupils (whose classes are usually smaller than those in state schools) retained their teacher estimates, which were usually higher than the adjusted scores. The chair of Ofqual, Roger Taylor, later said he could not understand how they had thought the algorithm would work.
As educational assessment, policy and politics researchers, we set out to investigate the kind of knowledge and expertise that contributed to the key decisions in 2020: cancelling the examinations, opting for an algorithmic allocation of grades, and the policy u-turn. Our findings have recently been published in the Oxford Review of Education.
We investigated policy documents, including SAGE committee minutes, Ofqual board minutes and Hansard. Additionally, we interviewed 16 key policy insiders from the regulator, the Department for Education, exam boards and head teachers.
At each of these moments, we found that decisions were taken in different parts of the system, drawing on distinctive sources of knowledge and expertise. The prime minister announced the examination cancellation with little consultation with educationalists; the SAGE minutes barely mention education. Ofqual’s choice of the algorithm was informed by a committee of educational and assessment experts. Facing a hostile public reaction, the Secretary of State, Gavin Williamson, initially blamed Ofqual, saying that he had not been informed about how the algorithm would operate. Yet the Minister had asked for the algorithm when he instructed Ofqual to ensure that results were similar to those of previous years.
This government priority narrowed the sources of knowledge and expertise drawn upon to a technical, data-driven approach. Reverting to teacher estimates was a purely political decision, shaped by calculations and pressures involving the public, the media, Conservative backbenchers, political advisors and inter-departmental civil service politics.
Of course, decisions made in crisis are time-sensitive and often dependent on a close inner circle. But opportunities to learn from other systems or from a wider range of experts were not taken. The headteachers we interviewed said they could have held socially distanced examinations in their almost-empty schools or in other empty local offices and buildings. Yet our research showed an absence of consultation with local authorities or the teaching profession.
Making policy processes transparent involves examining what kinds of evidence are in play in policy making and how that evidence interacts with politics. Research has typically focused on individual policies and has often been silent on the role of politics in framing education policy agendas. Our research programme seeks to contrast the way that knowledge and expertise are mobilised in policy across the four nations of the UK. Not only are the politics different, but the way in which policy is informed by different kinds of people, institutions and disciplinary knowledge also differs.
In England, our research points to a fractured decision-making process with problems that must be overcome, not least because further crises cannot be ruled out.