KPMG Foundation report left out negative findings

Exclusive

An education foundation left out negative findings from a study of its flagship literacy programme.

In December the KPMG Foundation, the charity arm of the global consultancy firm, released the findings of a major study into the Reading Recovery programme.

The report said pupils on the programme, which the charity’s website describes as its “flagship project”, were twice as likely to get five good GCSEs.

But the original report, put together by academics at the UCL Institute of Education and seen by Schools Week, showed that findings relating to a comparison group had been left out of the version released by KPMG.

The original report said the Reading Recovery pupils “did not significantly outperform” pupils who were from the same school, but not on the programme.

Stephen Gorard, an education research methodology expert at Durham University, said the boost could have been the effect of school intervention, rather than the literacy programme.

UCL and KPMG have said the charity decided to remove the group because its pupils had higher literacy scores at the start and so did not meet academic requirements for a comparison group.

But Gorard said KPMG should have “published all the results and explained properly” rather than gloss over the omission.

Reading Recovery, an intensive one-to-one intervention, was delivered to schools from 2005 to 2008 through Every Child a Reader, a £10 million project launched under Labour. The KPMG Foundation helped to fund the programme.

The landmark study, said to be the first of its kind in the UK to track pupils' progress over ten years, followed 239 children who received the Reading Recovery programme in 2005.

Of the 222 who could be traced ten years later, 49 per cent of the Reading Recovery group got five or more good GCSEs, including English and maths, compared with 23 per cent of pupils at schools without the programme.
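Those two proportions are what underpin the "twice as likely" headline. As a rough check, a minimal sketch in Python (the article reports only the two rates, not the comparison group's size, so only the ratio of the published proportions can be computed):

```python
# Back-of-the-envelope check of the "twice as likely" claim,
# using only the proportions reported in the article.

rr_rate = 0.49          # Reading Recovery pupils with 5+ good GCSEs incl. English and maths
comparison_rate = 0.23  # pupils at schools without the programme

risk_ratio = rr_rate / comparison_rate
print(f"Risk ratio: {risk_ratio:.2f}")  # ~2.13, consistent with "twice as likely"
```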

The KPMG report says “the positive effect of Reading Recovery on qualifications at age 16 is marked […] and suggests a sustained intervention effect”.

Jennifer Buckingham, a senior research fellow at the Centre for Independent Studies in Australia, who first spotted the discrepancies between the two reports, said there was "no explanation for the omission".

Jane Hurry, one of the UCL researchers, told Schools Week she was required to remove the comparison group of pupils to meet the requirements of an academic journal.

The journal required that the different groups of pupils in the study be matched at baseline on key indicators, which meant the comparison group with higher literacy scores had to be removed.

Hurry said the foundation made a “last-minute decision” to remove the group and produce a summary “suitable for a lay audience”.

The original UCL report will be published by the university after it has been peer reviewed, she said.

Stuart Kime, from Evidence-Based Education, said: “Communicating the results of studies with control groups can be challenging, so it may be that a decision was taken for the ease of comms.

“Nevertheless, the rationale is irrelevant when it’s put in the context of selective publication or publication bias. From a scientific perspective, publishing everything is the most defensible thing to do.”

A KPMG spokesperson confirmed a decision was made to “remove the group from the report, which was approved by the academic experts” at UCL.

The charity was “assured by UCL that standard academic procedures” had been followed. The company’s support for the study had now finished.

Your thoughts

6 Comments

  1. Tom Burkard

    KPMG has form with respect to Reading Recovery. In 2009, Policy Exchange published our report (https://www.policyexchange.org.uk/wp-content/uploads/2016/09/every-child-a-reader-feb-09.pdf), which found that the only evidence for RR's efficacy came from their own evaluations. Perhaps just as worrying were their costings: we discovered that each 'successful' intervention cost £6,625 per pupil, not £2,400 as they had claimed. KPMG conveniently omitted the cost of training their tutors and the administrative costs of the programme. To justify this expense, which was vastly greater than that of other proven interventions, KPMG calculated that by the age of 37 each illiterate pupil would have cost the taxpayer an additional £42,000. By contrast, an earlier American study estimated that Reading Recovery returned thirty cents on the dollar, at best.

    • ‘…whether Reading Recovery works is not so much the question, as whether it is right and proper for the Government to tell schools which remedial programmes should be used.’ (Extract from the Policy Exchange document cited above)
      Spot on. This includes considering whether it’s right and proper for the present government to tell schools which reading methods should be used or which maths systems are superior.

    • ‘…most of the reforms initiated by Kenneth Baker in 1988 and expanded under New Labour are based upon a top-down, coercive style of micro-management.’ (From the PX paper cited above.)
      And still it continues as ministers, Nick Gibb in particular, tell teachers how to teach.

  2. Stephen Gorard

    Interestingly, I think the safest analysis they could have presented would have suggested impact from the intervention anyway! The missing comparator group in the House of Lords version was those pupils in RR schools who were not selected for the intervention (largely because they were already better readers). Leaving them out means that the often weaker RR pupils were compared to all pupils in the non-intervention schools (we do not know which of those would have received RR had theirs been intervention schools; there is no simple criterion such as scoring below a certain level). So the fairest comparison would be all pupils in both types of school, including the missing group in the intervention schools. This would tend to dampen the effect size but would at least give an idea of whether RR was working. On the evidence from this small trial, it was.

    • Jane Hurry

      Thank you, Stephen. I had come to a similar conclusion and have now run an analysis with the bottom four children at baseline in each school, in both comparison and RR schools (so not all of the children in the RR schools would have received the intervention). As you predicted, the RR school children did significantly better than the comparison school children, but with a somewhat reduced effect size.
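For readers curious what the re-analysis described in this thread looks like in practice, here is a minimal sketch in Python with pandas of the selection Hurry outlines: take the lowest-scoring four pupils at baseline in every school, in both school types, and compare their outcomes. The column names and example records are entirely hypothetical; the study data are not public.

```python
# Sketch of the "bottom 4 at baseline per school" comparison described above.
# All data below are made up for illustration only.
import pandas as pd

# Hypothetical records: one row per pupil.
df = pd.DataFrame({
    "school":          ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
    "school_type":     ["RR"] * 5 + ["comparison"] * 5,
    "baseline_score":  [12, 15, 18, 22, 30, 11, 14, 19, 21, 28],
    "five_good_gcses": [1, 0, 1, 1, 1, 0, 0, 1, 0, 1],
})

# Bottom 4 pupils at baseline within each school, regardless of whether
# they actually received the intervention.
bottom4 = df.sort_values("baseline_score").groupby("school").head(4)

# Compare outcome rates between the two school types.
print(bottom4.groupby("school_type")["five_good_gcses"].mean())
```

Selecting the same number of low-baseline pupils in both school types, rather than comparing RR pupils with all pupils in non-intervention schools, is what makes the comparison fair in the sense Gorard describes.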