There are natural swings in results between year groups as talent and circumstance ebb and flow, suggests Loic Menzies, but under the current system this skews perceptions of school success.
Every year, hundreds of secondary schools’ results swing by more than 15 percentage points. Some shift by 30 percentage points. This does not necessarily mean a particular school has got any better or worse, but it gives parents confusing information and can make or break teachers’ and heads’ careers.
Worse still, the drive to show quick improvement and headline-grabbing “turnaround” results can make dodgy practice and narrowing the curriculum look too tempting for some to resist. Yet we know education is a long game, and we know that cramming your way through years 6 and 11 is not the way to go.
So what’s the solution? ‘Testing the water’, a recent report on the future of assessment from LKMco and Pearson, suggests it’s surprisingly simple: headline measures should shift to three-year averages. This would give parents a far more realistic picture of a school’s performance, which would be sheltered to some extent from short-term gaming, the inevitable volatility and the lottery of how talented the pupils happen to be in each cohort.
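To see roughly how the smoothing works, here is a minimal sketch with invented figures (the yearly percentages below are illustrative, not real school data):

```python
# Hypothetical illustration: three-year averages damp cohort-driven swings.
# The yearly figures are invented, not drawn from any real school.
results = [52, 68, 49, 63, 55, 70]  # % of pupils meeting a benchmark, per year

# Year-on-year swings in the single-year headline measure
single_year_swings = [abs(b - a) for a, b in zip(results, results[1:])]

# Three-year rolling averages, as the report proposes for headline measures
rolling = [sum(results[i:i + 3]) / 3 for i in range(len(results) - 2)]
rolling_swings = [abs(b - a) for a, b in zip(rolling, rolling[1:])]

print(max(single_year_swings))           # largest single-year swing → 19
print(round(max(rolling_swings), 1))     # largest smoothed swing → 7.0
```

Even in this toy example, the worst year-on-year swing drops from 19 percentage points to 7 once the headline figure is averaged over three cohorts.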
Most schools’ results shift by only 2.5 percentage points from year to year, according to a study by the exam regulator Ofqual. Where results are more volatile, this is generally due to unusually different attainment profiles between year groups. As ‘Volatility happens’, a report by Cambridge Assessment, explains, “each year, different students enter year 11 at each school and they are not identical to the group of students that sat for the previous year’s GCSEs at that school”. The combination of chance, different pupils taking the exams and schools themselves changing means year-to-year variation is to be expected.
To some extent, the government is aware of the problem. It has just published a response to the Education Select Committee’s calls for multi-year averages. The DfE says that it already publishes such figures and that it uses longer-term measures in its definition of ‘coasting schools’. However, it publishes this data as ‘additional measures’, buried deep in the league tables, where only the geeks will probe. Additional measures are not the measures that make or break careers.
More reasonable is the DfE’s claim that publishing three-year averages is currently impractical because of recent changes to accountability measures. However, fingers crossed, the stormy waters of our constantly changing education landscape will someday calm. When they do, longer-term averages are surely the way forward.
I remember sitting in a school leadership team meeting years back, pondering what to do about our year 9s. This cohort was considered academically weak and would be following in the footsteps of a stronger year group. This risked compromising a rising trajectory in our school’s performance. Of course we should have been discussing how to respond to these children’s needs, but such is the pressure to demonstrate constant improvement that the discussion was unhelpfully focused on what the headlines would say.
While my experience was at secondary level, this issue is even more of a concern for primary schools, where small numbers of pupils can dramatically skew results.
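A rough back-of-the-envelope sketch shows why small cohorts are so exposed (the cohort sizes here are invented for illustration): in a percentage-based headline measure, each pupil’s result is worth 100 divided by the cohort size.

```python
# Illustrative only: cohort sizes are invented, not taken from the article.
# In a percentage-based headline measure, one pupil's result is worth
# 100 / cohort_size percentage points.
for cohort_size in (20, 60, 200):
    per_pupil = 100 / cohort_size
    print(f"{cohort_size} pupils: one child moves the figure by {per_pupil:.1f} points")
```

With a primary-sized cohort of 20, a single child having a bad day shifts the school’s headline figure by 5 percentage points; in a large secondary cohort of 200, the same child moves it by only 0.5.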
A sustained drop or rise in results would still show in a three-year average, but a longer-term measure would reduce the stakes. School leaders would then feel less pressured to respond to a weak year group by hothousing pupils, narrowing the curriculum or gaming. The change would not only be fairer to pupils, it would also make league tables far more meaningful. We are not the first to make this case: the National Association of Head Teachers, the Education Select Committee and the Head Teachers Round Table have all made similar calls. It is now time the government made the shift.
Loic Menzies is director of the think-tank LKMco. ‘Testing The Water: How assessment can underpin not undermine great teaching’ was written in partnership with Pearson.