Ofsted has again refused to release data showing how many people supported or opposed its report card inspection plans – saying the “vast majority” of the 6,500 responses were not actually categorised by “sentiment”.
Legal experts on public consultations have said the admission means there is a “risk” decision-makers at Ofsted were not “provided with the relevant information” to inform their inspection overhaul plans.
The watchdog’s use of AI to code responses is also now facing scrutiny over accusations it could be “unreliable”, with potential wider implications for the use of such technology in government consultations.
Ofsted refuses release (again)
Responding to the findings, Pepe Di’Iasio, general secretary of leaders’ union ASCL, said the process was “very flawed”.
“It is difficult to see how decision-makers could have had a granular understanding of the consultation responses,” he added.
Ofsted’s inspection consultation ran from February to April. It consisted of 102 open-ended questions about proposed changes to inspections, which have started in schools this week. Three-quarters of respondents were education professionals.
Ofsted published only a narrative summary of its consultation findings in September, despite saying it had “sentiment” data on responses.
Schools Week requested this data under the Freedom of Information Act. We asked for overall responses based on Ofsted’s sentiment categories of positive, negative, mixed and neutral. We also asked for this broken down per individual question.
‘Most responses not coded’
In its response, sent on Friday, Ofsted again refused to release the information. However, it provided new details on its analysis.
Ofsted used a “hybrid approach” to look at the “vast amount of qualitative data”. Human coders analysed a “small proportion” of the early responses, coding each individual response “by theme and by sentiment”.
An AI model was then used to “analyse all responses” once the consultation closed. The AI, however, “produced several themes per question” and then “described each theme, rather than individual response, by sentiment”.
In its FOI response, Ofsted said: “The vast majority of responses were not directly coded by sentiment at the individual level.” It added that the hybrid approach “provided an overall understanding of sentiments and views”.
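To illustrate the distinction at the heart of the dispute, here is a minimal, hypothetical sketch in Python. The example responses, sentiment labels and themes are invented for illustration and do not come from Ofsted’s consultation or describe its actual system; the point is simply that per-response coding produces counts that could be published, while theme-level labelling does not.

```python
# Hypothetical illustration only - not Ofsted's system or data.
from collections import Counter

# Invented free-text answers to one consultation question.
responses = [
    "Report cards feel fairer than single-word judgements.",
    "The new toolkits add workload without improving accuracy.",
    "Unsure how the colour scale will work in practice.",
]

# Individual-level coding (as described for the early, human-coded responses):
# every response gets its own sentiment label, so totals can be tallied.
individual_codes = dict(zip(responses, ["positive", "negative", "neutral"]))
print(Counter(individual_codes.values()))
# Counter({'positive': 1, 'negative': 1, 'neutral': 1})

# Theme-level description (as described for the AI stage): sentiment is
# attached to a theme, not to any single response, so no per-response
# breakdown exists to count or release.
theme_sentiments = {
    "fairness of report cards": "positive",
    "workload concerns": "negative",
}
print(theme_sentiments)
```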

Ofsted also refused to release the sentiment data collected by the human coders because it was “not representative”.
“Releasing the sentiment data of early respondents in isolation would be misleading and potentially lead to incorrect conclusions being drawn by the public,” added Verena Braehler, Ofsted’s deputy director of research.
National Education Union general secretary Daniel Kebede said the response “only adds fuel to the fire”.
“Accurate analysis of all responses was the very least we should have expected and we can’t even guarantee that was the case,” he said. “The many teachers who took part should at least be able to have had faith in their responses being read.”
‘We weren’t counting boxes ticked’
However, Ofsted said the theming method was “just as meaningful” as coding individual responses by sentiment.
The consultation “wasn’t a quantitative exercise and we weren’t counting boxes ticked – this was free text and people expressed a range of views and sentiments within and across their individual responses, which were all captured in our analysis and reflected in our response to consultation”.
When shown details of Ofsted’s FOI response, Sam Hart, an associate at law firm Sharpe Pritchard LLP, said: “There is a risk that the decision-makers within Ofsted [were not] provided with the relevant information.”
The ‘Gunning Principles’, established in case law, set out standards for how consultations should be conducted. The fourth says “conscientious consideration” must be given to responses before a decision is made.
Hart said Ofsted’s response must be “sufficiently robust to ensure that individual responses are being conscientiously considered”. He added there is “a risk that AI may overlook or deprioritise minority viewpoints”.
Andrea Squires, a partner at law firm Winckworth Sherwood, added there was a question over whether Ofsted “was acting reasonably” in its use of AI.
‘We use AI ethically’
The use of AI to analyse government consultations is becoming increasingly common. One tool used by government departments, Consult, states that a recent evaluation found it accurately identified themes in 60 per cent of consultation responses.
Schools Week asked Ofsted for details of the accuracy of the AI model it used, but the watchdog declined to provide this information.
A spokesperson said they “use AI lawfully, ethically and responsibly. We ensure there is meaningful human control and validation of AI processes, and appropriate human oversight of the use of AI.”
Surveys, focus groups, test visits and stakeholder meetings also informed Ofsted’s inspection decisions.
Leaders’ union NAHT is considering whether to appeal against the high court’s refusal to let it pursue a judicial review of Ofsted’s new inspections.
General secretary Paul Whiteman said: “We have repeatedly called on Ofsted to fully publish all its analysis of the consultation responses. The fact that they won’t speaks volumes.”