I was invited to give evidence to the scrutiny committee for the second reading of the Education & Adoption Bill. I had a few things I wished to say – but wasn’t able to, given a truncated session and an exclusive focus (in my section) on the definition of a “coasting school”. The following expresses some of what I’d planned to say at the session (and have now supplied in written evidence).
Young people from disadvantaged backgrounds are over-represented in struggling schools, and I have challenged the existing focus on punitive “shaming” of these schools rather than support for them to improve. But – but – but! This bill risks confusing issues on a range of fronts.
Definition of ‘Coasting Schools’
As I said in my evidence, we need to be clear on what the problem is we wish to address with this concept of “coasting”, and how we then tackle it.
Previously, the concept of “coasting” was applied to schools not doing well, and “stuck” in a poor position without impetus for improvement. The shift in thinking can be illustrated by the focus and methodology of my report “(Un)Satisfactory”, which contributed to what was then termed the “coasting schools” policy agenda by demonstrating the unacceptable relationship between school quality and pupil disadvantage, basing its analysis on Ofsted grades and reports.
Analysis focused on those schools that had been “stuck” at satisfactory for two Ofsted inspections or more (16 per cent, at the time), and had been rated grade 3 or below for the judgement of “capacity to improve”.
It seems very odd, then, that the indicator published yesterday floats free of Ofsted accountability and grades.
This risks confusion for schools and parents, both in terms of the agendas on which schools are held to account and in terms of outcomes (given that it is currently possible for a school graded outstanding by Ofsted also to be identified as “coasting”). It also risks undermining Ofsted’s authority. If Ofsted’s own data-driven assessments are seen as flawed, surely those assessments ought to be amended, rather than separate accountability measures being imposed from a different source.
School improvement, and sponsorship
There does need to be more than one method for securing improvement – especially as the advocated method (academy sponsorship) is not yet well-evidenced.
In fact, to say academy sponsorship is not yet well-evidenced is not quite right: there is increasing evidence that its impact is patchy at best. The Education Select Committee inquiry into Academies and Free Schools was clear that the evidence is mixed.
And stripping out qualification equivalencies and GCSE retakes appears to have had an especially detrimental impact on results for sponsor academies and chains.
Work carried out with colleagues for the Sutton Trust showed a handful of chains achieving transformational improvement combined with relatively high attainment outcomes across a range of measures.
However, it also showed a similar number performing very poorly. We are currently completing analysis of the 2013-14 outcomes, and our initial findings suggest this trend has continued: some of the chains identified as having low results and no improvement have fallen back further in the intervening period.
One way to explain this pattern: rather than showing the typical reversion to the mean, academy chains appear increasingly polarised between the effective ones (which continue to improve outcomes for disadvantaged students) and the ineffective ones (which have got worse).
Going Forward
Hence struggling schools should be able to draw on support from a range of suitable improvement agencies, including successful academy chains, maintained school federations, an outstanding local school partner where one exists, or successful LA provision. The criteria should be quality, capacity, track record, and strategic vision, rather than provider type.
Widening the range of permitted improvement agencies would help the Department for Education (DfE) and Regional Schools Commissioners to expand the pool of potential providers – especially significant because the evidence suggests they need to raise the bar on commissioning, and to begin removing schools from sponsors more systematically and transparently.
On commissioning, evidence to the education select committee inquiry from the DfE showed extraordinarily high rates of sponsor approval: only 25 out of 704 applications to become a sponsor had been declined (3.6 per cent) as of November 2014.
Perhaps it is unsurprising, then, that far from all are thriving. We need tighter, transparent criteria for commissioning sponsorship. I suggest the criteria should be four-fold, as follows:
1. Quality (in terms of attainment, and offer to students)
2. Capacity
3. Strength of track record (against transparent criteria)
4. Clarity of strategic model and educational vision/strategy (including the governance model, school improvement strategy, regional coherence, envisaged rate of expansion etc).
Clearly, for new sponsors unable to evidence a track record, it would be especially important that the other application elements are robust and well scrutinised.
As greater numbers of sponsors enter the system, it will also be vital that the mechanisms for removing failing sponsors become more robust and systematically applied.
Emerging evidence highlights that this is necessary to secure school improvement – and it will be especially vital if sponsorship is to be the main systemic vehicle for that improvement.
As was argued by the Academies Commission in 2013, the current seven-year contract (or funding agreement) for schools should be reduced to five years. Seven years is far too long for a school to remain in the hands of a sponsor that does not secure improvement.
In the US, charters (contracts) are typically for three to five years, and evidence to the select committee inquiry suggested these tight contracts, coupled with rigorous assessment and non-renewal where necessary, are key to success.
Finally, we do of course have evidence of some sponsor chains achieving exceptional success. The NAO concluded that the DfE did not yet know why some academy chains are more successful than others. It is imperative that steps are taken to learn from the successful sponsors, and to spread these lessons across the system.
The evidence against academy conversion being the ‘key’ to ‘educational excellence’, as a DfE spokesperson told the Independent, is mounting. It is not just the Education Select Committee: the National Audit Office also found that informal methods such as support were more effective (and cheaper, of course) than formal interventions such as academy conversion. The Centre for Longitudinal Studies found ‘no evidence that government investment in particular school structures or types – for example, academies, free schools or faith schools – has been effective in improving the performance of pupils from poor backgrounds…’
Further evidence was published today. The LGA said only three of the 20 largest academy chains had above-average value added, compared with 44 of 100 LAs. It also said that ‘on average, pupils attending maintained schools achieved the same high standard of GCSE results in 2014 as those attending academies.’