Last year, on GCSE results day, panicked phone calls came through to the Schools Week office. There were dips – big dips – in the results of schools with outstanding records and well-respected leaders. What was going on?

Many callers said the exams regulator, Ofqual, must have moved the boundaries of English and maths GCSEs, echoing what happened in 2012 when English GCSE results dropped dramatically. Others asked if there was an issue with iGCSEs. Or was it just the law of averages catching up with over-achievers?

For the past three years, Ofqual has published a “volatility” chart showing how many schools had large drops or gains in their results.

The graphs showed a normal pattern: most schools shifting by only a few percentage points, fewer moving by more than 10 percentage points, and only a few hundred seeing substantial drops or gains.

So why the concern?

Big drops in maths and English tend to freak schools out because their overall pass rate is affected. If music GCSE takes a battering it doesn’t affect the headline measure. Headteachers who saw their maths or English plummet were therefore shocked and wanted answers, even though the overall dip rates looked normal – no more or less than in the previous year.

“It’s because we have more free school meals kids in our school,” some said. Others worried that having pupils with English as an additional language, or doing the EBacc, had ruined their rate.

Schools Week began requesting information from Ofqual to see who was correct. First, we asked for (and received) grade boundary reports flagging how and why decisions were made to change them. Second, we asked for the data behind the volatility charts. We wanted to see if schools with big drops had anything in common. Ofqual, equally concerned to see if certain schools were being treated harshly, worked with us to answer this question. The report published today is the fruit of that labour.

What the report shows is that only one factor correlates significantly with volatility in results: the more a school shifts its exam entries, entering or removing large numbers of pupils, the more likely it is to see large increases or decreases in its results. Other factors – pupil demographics or subjects entered – don’t appear to make a difference.

First lesson, then: the more you change exam entries, the bigger the risk your results will change – positively or negatively.

An alarming-sounding part of the report is that schools with more C-grade pupils have more volatile pass rates. But this is not because Ofqual is harsher on those schools. Instead, it’s a quirk of the accountability measures decided by government.

If a school has lots of pupils working around the D/C borderline, then a one-mark change in the grade boundary will affect lots of pupils and can cause a big dip, or rise, in the proportion achieving five A*-Cs.
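To see the scale of this effect, here is a minimal sketch using invented marks (not real pupil data): one hypothetical school whose pupils cluster near the C boundary, and one whose marks are spread evenly. Raising the boundary by a single mark barely moves the spread school’s pass rate, but knocks several percentage points off the clustered one.

```python
import random

random.seed(1)

# Two hypothetical cohorts of 200 pupils each (illustrative marks only).
# "clustered": many pupils score close to the C boundary (set here at 60 marks).
# "spread": marks are spread evenly across the mark range.
clustered = [random.gauss(60, 4) for _ in range(200)]
spread = [random.uniform(20, 100) for _ in range(200)]

def pass_rate(marks, boundary):
    """Percentage of pupils scoring at or above the grade boundary."""
    return 100 * sum(m >= boundary for m in marks) / len(marks)

for name, marks in [("clustered", clustered), ("spread", spread)]:
    before = pass_rate(marks, 60)
    after = pass_rate(marks, 61)  # boundary raised by one mark
    print(f"{name}: {before:.1f}% -> {after:.1f}% (drop of {before - after:.1f} points)")
```

The clustered school loses far more of its pass rate from the same one-mark shift, because so many of its pupils sit within a mark of the boundary – which is exactly the position of a school with lots of D/C borderline pupils.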

Schools with lots of A-grade pupils, notably the independent sector, also face volatility. But theirs is in the A/A* range. Having lots of pupils working around the A-grade range makes them sensitive to boundary changes at the higher grades. Big dips in their A grades, however, don’t grab the headlines so much because that score isn’t routinely published.

Lesson two: all types of schools have volatile results; it’s just that certain dips (in C grades) affect certain schools (those with lots of C-grade pupils), and those matter more because the government focuses on them.

Headteachers calling us last year were therefore seeing the combined impact of three things. One, volatility caused by moving pupils between exams. Two, genuine decreases in pupil ability. Three, having lots of pupils whose ability straddles the government’s main target. The first two are things heads can change; the last one, not so much.

Thankfully, performance measures are changing. As schools shift to Progress 8 and Attainment 8, this pattern will change. How? That’s not yet clear.

In the meantime, all school leaders should know that if lots of pupils switch exam entries, or lots of pupils work at the borderline of performance measures, they are more likely to see great leaps in results, but also reductions. So, do not despair if results dip, but also – perhaps – do not congratulate yourself too much on gains either.