Progress 8 is fairer, but some flaws need fixing

David Blow explains why your Progress 8 scores may have been lower than you had predicted.

The move to Progress 8 has undoubtedly been fairer than the previous measure of five A*-C GCSEs, especially for schools with lower prior (key stage 2) attainment, which were likely to have lower outcome attainment (at key stage 4).

In terms of the government’s “floor” standard, about 150-200 schools that were previously below the floor (of 40 per cent achieving five A*-Cs including English and maths) benefited directly, as Progress 8 revealed the good work they are undertaking.

However, many schools were surprised to find their published scores lower than they had predicted, and certainly lower than the calculations they made in August, based on their actual results. The reasons are two-fold.


First, many schools announced their “Progress 8 score” in August from their MIS (management information system), without being fully aware that the calculations were based on 2015 national Attainment 8 averages. The figures published in October by the Department for Education (DfE) were based on the 2016 national Attainment 8 averages, which were higher than in 2015, especially for pupils of low and average prior attainment. The result was that Progress 8 scores for virtually all schools were lower than they had appeared in August.

Second, as I predicted in July, many schools nationally have changed their curriculum so pupils are sitting more GCSE subjects in the EBacc3 “bucket”, especially pupils of low and average prior attainment. The purple lines in the graph (below) show that this happened in practice in June 2016.


As a result, the national EBacc3 Attainment scores increased. Because Progress 8 is a relative measure, schools whose EBacc3 bucket was already full (and so their school EBacc3 Attainment scores remained the same) were likely to see a reduction in their Progress 8 score, even if their results and entry policy remained the same. This in turn had a similar effect on the overall scores.
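The mechanics can be sketched as follows. A pupil's Progress 8 score is their Attainment 8 points minus the national average Attainment 8 for pupils with the same prior attainment, divided by ten; the school score is the average across pupils. All numbers below are invented for illustration, not DfE data, but they show why a rise in the national averages lowers a school's score even when its own results are identical:

```python
def pupil_p8(attainment8, national_avg_for_group):
    """Pupil-level Progress 8: gap to the national average for that
    prior-attainment group, divided by 10 (the number of slots)."""
    return (attainment8 - national_avg_for_group) / 10

# The same three pupils' raw Attainment 8 points in both comparisons
school_results = [48.0, 52.0, 55.0]

# Hypothetical national averages for these pupils' prior-attainment groups
avg_2015 = [46.0, 50.0, 53.0]
avg_2016 = [47.5, 51.5, 54.0]   # national averages rose in 2016

p8_vs_2015 = sum(pupil_p8(a, n) for a, n in zip(school_results, avg_2015)) / len(school_results)
p8_vs_2016 = sum(pupil_p8(a, n) for a, n in zip(school_results, avg_2016)) / len(school_results)

print(round(p8_vs_2015, 3))  # 0.2
print(round(p8_vs_2016, 3))  # 0.067 - lower, though the results were identical
```

The same results scored against the higher 2016 averages produce a lower Progress 8 figure, which is exactly what schools saw between their August estimates and the October publication.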

There are likely to be national increases again in June 2017, probably smaller – and with the additional complication of the switch to 9-1 and numerical values for A*-G – but schools should be more familiar with the new measures by then.

Overall, while Progress 8 has undoubtedly encouraged schools to focus on all pupils in a range of subjects – rather than just those on the C/D borderline in English and maths – a number of policy issues still need to be addressed. For now, schools need to be aware of their potential impact on June 2016 results.

DfE banding scheme

Unfortunately, the DfE at a late stage chose to adopt a banding scheme for Progress 8 to appear on the performance tables “Compare” website, which makes inappropriate use of confidence intervals. School A can have a higher score than School B and yet, because it has a few more pupils, be placed in a lower band!


As an average measure, Progress 8 is susceptible to the disproportionate impact of extreme outliers. An example of this would be the able pupil who is in hospital at exam time, leading to a pupil Progress 8 score of, say, -7. In a school of 100 pupils, the impact on the school’s P8 score would be -0.07 – not an insignificant amount.
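The arithmetic behind that example, with an invented cohort for illustration:

```python
# 99 pupils scoring exactly on the national average (pupil P8 of 0.0),
# plus one able pupil absent at exam time with a score of -7.0
cohort = [0.0] * 99
cohort.append(-7.0)

school_p8 = sum(cohort) / len(cohort)
print(school_p8)  # -0.07: one pupil moves the whole school's score
```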

Subject and qualification neutrality

Ideally, pupils should be able to choose within each bucket the appropriate subjects and qualifications without affecting the Progress 8 outcome. This is not the case.

In short, while some adjustments need to be made, the change from a threshold performance measure (5A*-C) to one that looks at the performance of all pupils is to be welcomed as a principle for school accountability measures.


David Blow is head of the Ashcombe School in Surrey, and a member of ASCL data group. He will deliver a keynote speech at the ASCL Leadership of Data Spring conferences in February.

Your thoughts



  1. We now have a system where a school cannot know how well it is doing on the official measures until months after the students have taken their exams. If your students are doing the maximum then year on year your performance will fall as others find ways to catch up. And all the time these assessments are measuring how well students can jump through exam hoops! Is that what school is for? You need to be a statistics genius to understand this system. All for what purpose? Are our students actually more educated than before? Do teachers want this? Do employers want this? Is the assessment system the tail wagging the dog?

    Is it time to look at what education is for, both for students and society?

    • Absolutely right. As usual David has given a superbly clear explanation. Sadly however what should have been a useful accountability measure that would have been superior to threshold ones has been turned into something totally confusing and meaningless for professionals. It is a distraction that tells us little or nothing about the learning outcomes pupils have achieved. All the evidence I see is that employers know nothing about this and are going to have real difficulty making head or tail of the CVs of their potential employees for years to come.