What we can, and cannot, tell from A-level and GCSE results this year

A tricky part of reporting A-level and GCSE results this year is that school leaders will want different information from what we in the media will have available. Editor Laura McInerney looks at what we will be able to say from the data.

Previously, for GCSE, schools were focused on how many of their pupils passed subjects at a C grade or above. By knowing pass rates for the whole country, a school leader or department head could roughly work out if they were above or below average, and judge based on prior scores whether things were going in a roughly positive or negative direction.

For A-levels, no one paid much attention to headline pass rates on results day. Attention was on individual students getting the grades they needed for university places or job offers. In the days after, department heads would typically work out how well their department had done against national pass rates, or against pupils' prior achievement, and write reports only for internal use.

This year, things will be different. Schools have new headline measures at both GCSE and sixth form. Progress rates will matter. So will attainment across a basket of subjects. In sixth forms, retention and drop-out rates will matter too.

Unfortunately, these measures will be very difficult to extrapolate from the blunt data we will have.

On results days the media (and the public at large) are only given national pass rates per subject. This isn't going to change.

From that data we can see the overall percentage of pupils who passed English GCSE, the percentage who passed maths, and the percentage who passed science. What we cannot see is how many pupils passed both (or all) of those subjects, because we only get subject-level data, not pupil-level or school-level data.
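To see why subject-level figures cannot pin down a combined measure, here is a minimal sketch using made-up pupils (not real results): two cohorts with identical English and maths pass rates can differ on the share of pupils passing both.

```python
# Two hypothetical four-pupil cohorts: each pupil is (english, maths),
# where 1 = passed and 0 = did not. Illustrative data only.
cohort_a = [(1, 1), (1, 1), (1, 1), (0, 0)]
cohort_b = [(1, 0), (0, 1), (1, 1), (1, 1)]

def subject_rates(cohort):
    """The subject-level pass rates the media are given."""
    n = len(cohort)
    english = sum(e for e, m in cohort) / n
    maths = sum(m for e, m in cohort) / n
    return english, maths

def both_rate(cohort):
    """The combined measure, which needs pupil-level data."""
    return sum(1 for e, m in cohort if e and m) / len(cohort)

print(subject_rates(cohort_a), both_rate(cohort_a))  # (0.75, 0.75) 0.75
print(subject_rates(cohort_b), both_rate(cohort_b))  # (0.75, 0.75) 0.5
```

Both cohorts publish the same 75% pass rate in each subject, yet in one cohort 75% of pupils passed both subjects and in the other only 50% did, which is why the published aggregates alone cannot answer the combined question.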

We could make a crude guess at progress rates. Companies that sell attainment information to schools are already poised to make their best estimates of the national progress rates. But these are worked out using historical data, and we have no idea how accurate they are. The numbers are probably not going to be miles off. They might even be bang on. But they are very much guesstimates, and we won't know how correct they are until schools get official progress data later in the year.

So, what can we know on results day?

On results day we can see the percentage of pupils getting each GCSE or A-level grade (that is, how many As, Bs and Cs) – and we get a crude demographic breakdown, e.g. boys vs girls, or independent vs state schools. From this we can tease out some interesting things. Did more girls get As in science this year than last? Are more boys taking a subject than before?
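That kind of year-on-year comparison is simple arithmetic on the published percentages. A sketch, with illustrative figures rather than real results:

```python
# Hypothetical published breakdown: % of girls' science entries
# awarded an A in each year. Made-up numbers for illustration.
girls_science_a = {2015: 24.0, 2016: 25.5}

def year_on_year_change(series, year):
    """Percentage-point change from the previous year's published figure."""
    return series[year] - series[year - 1]

print(f"{year_on_year_change(girls_science_a, 2016):+.1f} percentage points")
# prints "+1.5 percentage points"
```

Note this compares cohorts of entries, not the same pupils tracked over time – which is exactly why it can be reported on results day while progress measures cannot.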

But we cannot work out progress rates (because we don’t have those yet) or combination measures (because we only have cohort scores, not those for individuals).

Not knowing these headline measures tomorrow will be massively annoying for school leaders trying to work out how well they did nationally. We know that schools are going to be judged as coasting based on these scores, and those judgements will have huge consequences.

Unfortunately, you will have to wait.

There is one upside. As schools are judged by a series of measures, rather than just the 5 A*-C pass rate, and as those results get harder to work out (which they will in upcoming years, due to further changes), the focus of results day will go back to where it should be: the hard work of pupils and their teachers. All we will be able to tell is whether individuals got the grades they needed for the next step, or didn't. And concentration can go on ensuring everyone moves on to an appropriate course, job, college or university – whatever will best help them learn the next things they need to learn.

It makes for much less exciting reporting. It might make for a calmer day, though.