To reduce the message of this book to its simplest form: PISA tests are carefully designed to be as reliable as possible within real-world constraints, but journalists and politicians (yes, those equally despicable creatures) often use them to make invalid claims.

This deceptively small book sets out to explain what these constraints are, how the OECD attempts to account for them and, ultimately, what policy makers can and cannot reasonably infer from PISA results. The authors sum up the problem succinctly: “Accurate test data on its own cannot improve school systems; it is how it is interpreted and used that is crucial.”

As someone who works in the business (and not always at such self-respecting publications as my present employer), I find the authors’ claim about the media reads as a sad platitude: while PISA is the state of the art in educational measurement, the media (they are justifiably careful in using this word rather than the more specific “journalists”), in their quest for punchy headlines, focus on a tiny aspect of what PISA actually is.

You guessed it: the rankings.

But the rankings are not always accurate, we are told. OECD documentation makes a point of explaining that it is not possible to assign exact positions to countries based on their average scores in reading, maths and science. For example, when the Netherlands ranked 10th for maths in 2012, its score was not significantly different from that of Vietnam, in 17th place. The rankings are “estimates”, and they tell us little unless we also know “the size of the errors associated with the rank”.

But here’s the rub. In order to maintain its own significance in the public consciousness, the “pragmatic” OECD, posit the authors, recognises that the media will want league tables. The authors thus convey the sense of an organisation balancing on a knife edge between distortion and insignificance. But which is the greater evil?

Interestingly, media headlines tend to bear out the authors’ framing of PISA as producing “catalyst data” – creating a sense of crisis that can engender change. But the question is, are politicians drawing the right conclusions and thus making the right changes? How many people know, for example, that PISA is not even designed to test how well students have learned what is taught in schools, but how well they can “apply knowledge and skills to solve problems based on real-life situations”?

After Finland bagged the top spot for literacy in 2000, its edu-tourism industry boomed. Shanghai’s position at the top of the table in 2009 prompted a flurry of UK government attention aimed at working out what it was doing so effectively, even though comparing a single Chinese province with the whole of the UK is like comparing “apples and oranges”, the authors remind us – which leads to a whole other discussion of political pressure and buy-in.

This book aims not to discredit PISA testing, but to sort the wheat from the chaff and tell us which data should actually form the basis for policy decisions.

If you are among that handful of academics already familiar with the technical aspects of PISA testing, this is possibly not the book for you. For everyone else working in schools, education policy or research – and especially if your knowledge of PISA is limited to the stories you read in the mainstream press – it is a must-read, and a quick, accessible one.

It by no means denigrates PISA testing; it simply encourages informed scepticism of the attention-grabbing front pages and the policy decisions that flow from them. I cannot possibly communicate the book’s subtleties here – so please, don’t rely on the titbits; read the whole thing. If we all do, the country will be so much better prepared to challenge the next round of ridiculous policy suggestions based on flawed inferences.

By the end, the authors have given just enough for an intelligent debate on the subject. In fact, they even end with seven points for discussion, should you ever get stuck in a lift with Justine Greening.