Opinion

Don’t ditch internal data in inspections

1 Apr 2019, 16:52

School leaders will surely welcome Ofsted’s stated intention to ‘dial down’ the influence that data has on inspection outcomes, and the consequent reduction in workload, but as ever, the devil is in the detail, says Ian Hartwright

The draft inspection handbook goes much further than ‘dialling down’. From September, Ofsted proposes that inspectors will not consider schools’ internal assessment data at all. Instead, only statutory data contained in Ofsted’s Inspection Data Summary Report (IDSR) will be admissible.

Ofsted’s position is that because inspectors cannot verify such data, it is not reliable or valid. So once again the starting point for inspection is that school leaders and teachers are not to be trusted. If we are to make progress towards a truly school-led system, where the lateral accountability associated with other high-performing jurisdictions is to become the norm, we urgently need to move away from a deficit model.

It’s certainly true that recent inspection practice has ‘skewed’ the data in question. In fact, this potential exists for everything that inspectors evaluate. For example, inspectors’ consideration of lesson planning ‘skewed’ lesson planning itself – so much so that inspection practice was changed. Similarly, inspectors’ evaluation of, and recommendations on, the marking of pupils’ work undeniably skewed marking practice, again prompting a change in inspection practice.

We know that high-stakes, single word judgements drive perverse incentives and unintended consequences.

But a blanket refusal to consider any internal monitoring data risks, in itself, driving an increased priority and focus on external data, to ensure that the school clears the first hurdle of an inspector’s preparatory work.

Adopting comparative performance data would be a bigger step forward

It’s also a missed opportunity. An oft-levelled criticism is that inspection contributes little in the way of real insight to schools. In many cases a school’s prudent monitoring has benefitted pupils’ achievement; but, of course, examples also exist of cumbersome, unreliable and bureaucratic systems that have become ends in themselves, contributing little of value. So inspection could play an important role in evaluating the efficacy of schools’ monitoring systems and in sharing good practice.

And then there are the cases where external data is a problematic, unreliable or invalid measure of school performance.  In small schools; middle and junior schools; schools with high pupil mobility; schools working in challenging circumstances; and schools on rapid improvement journeys, in-year internal monitoring analyses can help to bridge the incomplete picture provided by national data.

Ruling out consideration of internal data is a blunt tool that could make it harder to recognise and acknowledge recent improvement, especially for schools in challenging circumstances where the time lag in statutory data is a factor.

A huge amount will, instead, rest on book scrutiny and pupil interviews. But is there any reason to suppose that these samples will provide a more robust and reliable view of outcomes, particularly given that inspectors will be under pressure to collect and triangulate substantially more evidence? This risks incentivising schools to give undue attention to book scrutiny, sparking another industry that could all too easily develop the prominence that data, marking or lesson planning previously had.

Declaring year zero is impossible – the high stakes remain despite this small evolution, meaning that defensive behaviours to ensure inspection readiness will continue to appear rational.

A more fundamental reset is required, where the inspectorate focuses its capacity on delivering a deep-dive diagnostic on those schools that struggle to improve. The remainder of the system requires a light touch to check that pupils are safe, the curriculum is broad, and standards aren’t slipping. Adopting comparative performance data (based on a three-year average) would be a bigger, more positive step forward.

