What if the reason a secondary school had a poor progress rate was not due to its own teaching, but because the pupils arrived with overinflated results from primary school?

Researchers at the number-crunching powerhouse Education Datalab believe this theory holds true, after they looked at secondary schools that took pupils from 30 different primary schools with suspiciously high key stage 2 SATs scores.

The calculations to figure this out are relatively simple. Taking one secondary with a low overall Progress 8 score, Datalab separated out the average progress score for pupils from each of its feeder schools. Pupils from most schools had the same rate of progress. But one school stood out: on average, each pupil who attended “Primary School H” received two grades lower at GCSE than expected given their key stage 2 scores, a pattern that repeated in 2015 and 2016.
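That breakdown is essentially a group-by over pupil-level progress scores. A minimal sketch, using made-up data and illustrative school names (these are not Datalab’s actual figures or methodology):

```python
# Sketch of the per-feeder-school breakdown, using made-up data.
# Each record pairs a feeder primary school with one pupil's progress score,
# i.e. actual GCSE outcome minus what their key stage 2 scores predicted.
from collections import defaultdict

pupils = [
    ("Primary A", 0.1), ("Primary A", -0.2), ("Primary A", 0.0),
    ("Primary B", 0.2), ("Primary B", -0.1),
    ("Primary H", -2.1), ("Primary H", -1.9), ("Primary H", -2.0),
]

# Group scores by feeder school.
totals = defaultdict(list)
for school, score in pupils:
    totals[school].append(score)

# Average progress per feeder school.
averages = {school: sum(scores) / len(scores) for school, scores in totals.items()}
# Pupils from "Primary H" sit roughly two grades below expectation,
# while the other feeders hover near zero.
```

Separating the averages like this is what makes a single anomalous feeder visible, where a whole-school average would have simply dragged the secondary’s overall figure down.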

And when they looked across the country they found more. In total, pupils from 30 primary schools went on to achieve, on average, one grade lower in every GCSE, and not because of issues with the secondary schools: Datalab controlled for that. It is simply that pupils in these 30 schools, for whatever reason, do extremely well in their SATs exams and then bomb at secondary.

As I see it, there are three plausible theories. One, Primary School H straight-up cheated: it hired in scribes, pointed out answers, amended papers and so on. If this sounds outlandish then it’s worth remembering that a Teacher Tapp survey in November revealed that eight per cent of teachers said they had been asked to point out incorrect answers to children and 11 per cent had been asked to give a child undocumented additional support on reading or writing.

Theory two: the school didn’t cheat but it worked incredibly hard, week after week, drilling pupils until they could all pass their exams with flying colours. Unfortunately, doing so left the pupils with little capacity to learn anything else, and when they got to secondary school they were overwhelmed by the range of subjects and independent learning and so fell apart.

Or, theory three: something about pupils in these schools made teachers at secondary school respond to them less well. Maybe the pupils were all of a particular race, and then went into racist secondary schools. Maybe their swottiness was disliked by teachers at their new school.

Honestly, this last theory is the weakest. It may cover the odd one or two among the 30, but I can’t see it holding for them all.

Which leaves us with cheating or overpreparing. And that’s awkward, not only because those SATs results might well have determined which set pupils were put into (around 60 per cent of schools use SATs scores for setting), but also because it shows how precarious the progress measure for any one school really is. One unethical feeder school and – pow! – you’re done.

On the other hand, I wondered if any primary schools out there had deliberately lowered their scores to hand a secondary school an unfair advantage in its progress measures. Apparently not. The cleverbods at Datalab tested for this and found precisely zero primary schools where pupils went on to make an extra grade in each GCSE subject compared to expectations.

When it comes to SATs manipulation, it appears the force only drives in one direction.

Datalab’s report ends by noting that the Department for Education and Ofsted might wish to remove the results of pupils from anomalous primary schools before judging the progress measures of a secondary school. It’s an idea both will take into consideration.

But, more importantly, this must surely now trigger exam-day monitoring from the Standards and Testing Agency, along with a series of other checks to ensure that these schools, many of which will have been rated ‘outstanding’ long ago and then left without further monitoring, are adequately preparing their pupils for the future.

Laura McInerney is contributing editor of Schools Week