News of the death of phonics has been greatly exaggerated this week, sparked by an evaluation by the Education Endowment Foundation (EEF) of two programmes developed by Ruth Miskin Training, and currently in use in some 8,000 schools and 62 multi-academy trusts.
Read Write Inc. (RWI) and Fresh Start (FS) are systematic synthetic phonics (SSP) programmes, and initial sign-on alone can cost a school around £2,000, so the evaluation (at a cost of £1 million) was high-stakes. But SSP endorsement by the DfE (and former schools minister, Nick Gibb especially) means the stakes were even higher. In many teachers’ and leaders’ minds, RWI is essentially synonymous with SSP. Indeed, Ruth Miskin’s promotional material cites Gibb’s endorsement of SSP as evidence for its own approach.
RWI has been going since 2002 and is aimed at “children from reception to Year 4, and children with SEND in older year groups”. FS is for those in years 5 and 6 who “have slipped through their primary school’s reading net”, “have missed schooling” or “are new to the UK education system or whose first language is not English”.
At first glance, the randomised controlled trial conducted with over 7,000 pupils across 131 schools isn’t good news for Ruth Miskin. The details have by now been well dissected, but what seems to have been missed by those keen to dismiss RWI, FS – and SSP altogether, for some – is that the EEF gave the evaluation only a low-to-moderate security rating for RWI and a moderate one for FS.
Understanding why may be a more fruitful line of inquiry, and one not about phonics but about a faltering evidence-led profession.
Mistrial
The EEF says 20 per cent of schools in the RWI control group had already received RWI training or purchased RWI materials. Skewing results further, 15 per cent of schools in the intervention group didn’t deliver the programme at all. That’s enough contamination on either side to produce a very different result, but the researchers can’t draw that conclusion.
In the FS evaluation, more than one-third of schools in the intervention group (35 per cent) didn’t deliver the programme, nearly another third (29 per cent) delivered FS only to some eligible pupils, and 12 per cent didn’t provide enough data. That adds up to over three-quarters (76 per cent) of the intervention schools.
It’s a wonder the EEF even published the evaluation. But how could they not?
Publish and be damned
Stoking the backlash is the fact that the trial was conducted between 2016 and 2018. Rather than being published as planned, in 2019 the evaluation was folded into the ‘Teaching and Leadership Innovation Fund’ evaluation. But Covid delays meant that wasn’t published until last week.
It isn’t the only time (even just this month) that the EEF’s publication schedule has been called into question. Indeed, they are developing new ‘nimble RCTs’ to respond to the need for speed. But publishing evidence for a pressured profession is sensitive, not least when wading into the ‘reading wars’.
Too rushed, and you face the kind of criticism levelled at Carol Dweck. Too slow, and the results risk being of no use to a profession still led by the changing winds of political priorities. Either way, one side or another of a divided profession will attack the messenger.
Evidence-engaged
But there’s a bigger problem still: calling for an evidence-led profession won’t work if it isn’t also evidence-engaged. Blaming Queen’s University Belfast for recruitment woes (as the Ruth Miskin statement does) is short-sighted, even if correct. Why weren’t schools more willing or able to sign up? Why did so many fail to account for their prior engagement with RWI/FS at the outset? And why did so many who signed on not implement the strategies? It’s impossible to expect reliable evidence from an unreliable dataset.
Underpinning all of this is still very strong evidence in favour of SSP. And the over-reliance on it identified by UCL is an equally valid consideration. Sadly, school leaders and teachers find themselves caught once more in the middle of a tug of war, with no time to make an informed decision. They wouldn’t be wrong to disengage from the evidence, make a choice and stick to it until instructed otherwise.
Meanwhile, few are talking about how to bridge the gap between researchers and schools to make the latter a more reliable source of data. And until that happens, nothing in the evidence base is secure.
Unless, of course, it comes from somewhere that has more consistently championed teaching as a profession.