The effectiveness of teacher peer observations is in doubt, after a £1.2 million study at 82 schools found they made no difference to exam results.
The findings, published by the Education Endowment Foundation, are based on the English and maths GCSE results of more than 14,000 secondary pupils who took part in trials.
No evidence was found of grade improvements when teachers took more frequent part in peer-to-peer observations – which in some cases took place monthly.
The research has prompted calls for a rethink of how observation is used in schools.
For the study, a randomly selected group of teachers was trained in observation skills by the University of Bristol, and then observed, and were observed by, their colleagues several times over the course of two years.
Their pupils’ results were compared with those of a control group whose teachers had not used this format of peer-to-peer observation.
Each of the schools had high proportions of pupils eligible for free school meals, and many were using lesson observations of some kind before the trial began.
However, the findings led the researchers to conclude that the “structured observation programme” – which trained staff and increased the number of observations – did not have any benefits “over existing levels of peer observation” already occurring within schools.
During the study, many teachers failed to complete the recommended number of observations – between 12 and 24 – as they had difficulty fitting them into their timetable. Others said they felt “uncomfortable” taking time out of teaching to complete them.
But even when teachers did complete the full observation schedule, there was no evidence that their pupils achieved better results.
The EEF trial, prompted by an earlier US study that found structured lesson observation led to gains in student and teacher performance, raises questions about the time and money schools spend on observation.
The estimated cost to EEF of the observations was around £4,000 per school, per year. This does not include teacher time, which was not reimbursed and was undertaken as part of usual teaching duties. Most of the money was spent on training and on the software and iPads used to record observations. In total, the EEF spent £1.18 million on the study.
Dame Alison Peacock, the chief executive of the Chartered College of Teaching, which this week signed a pledge to support evidence-based practice, said the difference between the EEF findings and the earlier work in the US “emphasises the importance of carrying out robust evaluation, in context, before simply jumping on a bandwagon”.
She pointed out that other academics, such as Professor Rob Coe, have previously shown that peer observations focused on rating individual aspects of a lesson did not change pupil outcomes, and have suggested that it is hard to judge teaching quality in this manner.
Given that observations “can be hugely time consuming, and given the serious workload challenges already facing teachers, there is an urgent need for further evaluation of other kinds of lesson observation to understand where, when and whether they may be effective in developing practice”, she said.
The study’s findings contradict received wisdom that peer observation among teachers leads to improvement. In recent years, other academic studies in the UK have espoused its virtues, and the EEF trial found it was fairly widespread in schools.
A compendium put together in 2009 by academics at Leeds Metropolitan University lists the benefits of peer review, including opportunities for teachers to reflect on and review their teaching skills and “learn new tricks from one another”.
The Learning Institute at the University of Leicester also speaks of the “opportunity to mutually enhance the quality of their teaching practice” that is provided by the peer observation process.
Sir Kevan Collins, the chief executive of the EEF, said the latest research found that schools “shouldn’t expect to see an improvement in results” by increasing the frequency and intensity of their teacher observations.
The EEF toolkit https://educationendowmentfoundation.org.uk/evidence-summaries/teaching-learning-toolkit gives lots of examples of interventions that improve learning by so many months. Where they have themselves tested the validity of these results with larger trials, they have found in all cases that there was zero effect of the intervention.
So, how many of their dozens of claims for improved learning are actually valid? On the basis of their own results so far, zero.
We are now creating the Chartered College of Teaching which says it will only use evidence to create policy. Great sound bite, but so far it looks like most of the educational “research” shows there are no magic bullets for improving learning.
Let’s just concentrate on developing great teachers like craftsmen, who take time to become masters of their field, rather than continuing this endless search for a cheap management fix.