David Blow explains why your Progress 8 scores may have been lower than you had predicted.
The move to Progress 8 has undoubtedly been fairer than the previous measure of five A*-C GCSEs, especially for schools with lower prior (key stage 2) attainment, which were likely to have lower outcome attainment at key stage 4.
In terms of the government’s “floor” standard, about 150-200 schools that were previously below the floor (of 40 per cent achieving five A*-Cs including English and maths) benefited directly, as Progress 8 revealed the good work they are undertaking.
However, many schools were surprised to find their published scores lower than they had predicted, and certainly lower than the calculations they made in August, based on their actual results. The reasons are two-fold.
First, many schools announced their “Progress 8 score” in August from their MIS system, without being fully aware that the calculations were based on 2015 national Attainment 8 averages. The figures published in October by the Department for Education (DfE) were based on the 2016 national Attainment 8 averages, which were higher than 2015, especially for pupils of low and average prior attainment. The result was that Progress 8 scores for virtually all schools were lower than they appeared in August.
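The rebasing effect can be sketched with illustrative figures (the numbers below are assumptions for illustration, not real DfE averages). A pupil's Progress 8 score is their Attainment 8 score minus the national average Attainment 8 for pupils with the same key stage 2 prior attainment, and the school score is the mean of the pupil scores:

```python
# Illustrative sketch of the Progress 8 rebasing effect.
# All figures are assumed for illustration, not actual DfE averages.

def pupil_progress8(attainment8, national_avg_for_prior_band):
    """Pupil Progress 8: own Attainment 8 minus the national average
    Attainment 8 for pupils with the same KS2 prior attainment."""
    return attainment8 - national_avg_for_prior_band

def school_progress8(pupil_scores):
    """School Progress 8: the mean of the pupil-level scores."""
    return sum(pupil_scores) / len(pupil_scores)

# A pupil with 48 Attainment 8 points, in a prior-attainment band whose
# (assumed) national average was 46.0 in 2015 but rose to 47.5 in 2016:
august_estimate = pupil_progress8(48, 46.0)   # against 2015 averages
published_score = pupil_progress8(48, 47.5)   # against 2016 averages
print(august_estimate, published_score)
```

The pupil's results are identical in both calculations; only the comparator changed. Because the 2016 national averages were higher, the published score is lower than the August estimate, which is exactly what schools saw in October.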
Second, as I predicted in July, many schools nationally have changed their curriculum so pupils are sitting more GCSE subjects in the EBacc3 “bucket”, especially pupils of low and average prior attainment. The purple lines in the graph (below) show this happened in practice for June 2016.
As a result, the national EBacc3 Attainment scores increased. Because Progress 8 is a relative measure, schools whose EBacc3 bucket was already full (and so their school EBacc3 Attainment scores remained the same) were likely to see a reduction in their Progress 8 score, even if their results and entry policy remained the same. This in turn had a similar effect on the overall scores.
There are likely to be national increases again in June 2017, probably smaller – and with the additional complication of the switch to 9-1 and numerical values for A*-G – but schools should be more familiar with the new measures by then.
Overall, while Progress 8 has undoubtedly encouraged schools to focus on all pupils in a range of subjects – rather than just those on the C/D borderline in English and maths – a number of policy issues still need to be addressed. For now, schools need to be aware of the potential impact of these issues on their June 2016 results.
DfE banding scheme
Unfortunately, the DfE at a late stage chose to adopt a banding scheme for Progress 8 to appear on the performance tables “Compare” website, which makes inappropriate use of confidence intervals. School A can have a higher score than School B and yet, simply because it has a few more pupils, be placed in a lower band!
As an average measure, Progress 8 is also susceptible to the disproportionate impact of extreme outliers. An example would be an able pupil who is in hospital at exam time, leading to a pupil Progress 8 score of, say, -7. In a school of 100 pupils, the impact on the school’s P8 score would be -0.07, which is not an insignificant amount.
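The arithmetic behind that outlier effect is easy to verify, using the same assumed figures as the example above (one pupil at -7 in a cohort of 100):

```python
# One absent able pupil with a pupil-level Progress 8 score of -7.0,
# in a cohort of 100 where the other 99 pupils are assumed to average
# out to zero, shifts the whole-school mean by -0.07.
cohort = [0.0] * 99   # assumption: remaining pupils net to zero
outlier = -7.0
school_p8 = (sum(cohort) + outlier) / 100
print(school_p8)
```

Because the school score is a simple mean, a single extreme pupil score always moves it by that score divided by the cohort size, so smaller schools are hit proportionally harder by the same event.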
Subject and qualification neutrality
Ideally, pupils should be able to choose within each bucket the appropriate subjects and qualifications without affecting the Progress 8 outcome. This is not the case.
In short, while some adjustments need to be made, the change from a threshold performance measure (5A*-C) to one that looks at the performance of all pupils is to be welcomed as a principle for school accountability measures.
David Blow is head of the Ashcombe School in Surrey, and a member of ASCL data group. He will deliver a keynote speech at the ASCL Leadership of Data Spring conferences in February.