I’ll come clean: I want to see more robust educational research in the hands of anyone involved in making informed decisions about practice and policy.
But there are undeniable difficulties. As this book makes abundantly clear, there is an awful lot of poorly designed, poorly evaluated research out there, and, as the authors put it, sometimes it can cause more harm than good: “ignoring such research is the most rational and safest thing to do”.
Anyone who has tried to navigate the waters of education research – that is, if you’ve managed to get past the paywalls – in order to inform their decision-making will know that, with a scant research background, it is hard to judge the trustworthiness of what is published. The authors endeavour to present “innovative methods for the design, conduct, analysis and use of evidence from robust evaluations like educational trials”.
The stakes are high: a lot of money is spent trying to uncover what works to improve the education system and with it the future of the students in our trust. The authors argue that there is a lack of “strategic vision” to test out the things that are most likely to significantly improve outcomes for our students, for both academic and what they call “non-cognitive” outcomes.
The (bitter) irony of the title becomes clear. With what sounds like seething anger (or is it sheer frustration?), the authors do not mince their words when denouncing the money and time wasted on poorly designed research projects, the misleading “impact” evaluation of RCTs in particular, and poorly reported results.
I’ll come clean again – I’m no expert in statistics. I struggled through a couple of early chapters explaining at length that, without proper randomisation, using significance testing in an impact evaluation is simply wrong, but even I could see that the method they advocate is more straightforward and reliable. Just don’t ask me to explain the p-value (though I got the hang of the NNTD (number needed to disturb), I think…).
After deploring the current state of things and denouncing those who contrive to maintain the status quo – “much policy and practice today seems to be … evidence-resistant” – the book moves on to a much more constructive tone. The authors set out on a quest to uncover prior evidence for key lines of promising inquiry, then describe in detail the new research projects they conducted and evaluated (when they say “in-depth”, they’re not kidding).
First, they focus on catch-up projects around the transition from primary to secondary school, intended for pupils struggling (mainly) with reading – those likely to struggle to access the secondary curriculum and to fall further behind their peers.
Six interventions are tested. Suspense builds: you read conscientiously through the existing evidence, hold your breath through the description of the trials and their evaluations, resisting the urge to turn the page and go straight to the findings. Finally you are rewarded… with frankly disappointing results – some surprising, some underwhelming. I have only myself to blame; part of me wanted a neat happy ending. The truth, of course, is far more complex.
The team then repeats the process with the testing of promising or popular whole-school approaches, such as adopting a core knowledge curriculum at primary, philosophy for children, and even enhanced formative feedback, and finally with the testing of approaches that seek to “educate the whole person”. The results are once again surprising but ultimately disappointing.
You have to admire the drive to conduct these projects in the most uncompromising way possible, the painstaking transparency, the advocacy for a trustworthy process. Each impact evaluation is complemented by a process evaluation, using the qualitative data gathered along the way, which is fascinating in its own right, revealing many of the barriers that often make educational trials in school settings so thorny. What is uncovered along the way are things that definitely don’t work – and I suppose that’s something. Definite issues are identified which should – will? – be investigated further.
If you are interested in conducting research, this is a great book. If you’re looking for quick answers and starting points, wait for the factsheets.