Summer exams will come too late to start learning useful lessons from Covid but we don’t have to wait that long, write Natalie Perera and John Moore

The Covid-19 pandemic is arguably the biggest post-war challenge this country has faced. While the road to full recovery still seems a long way off, schools up and down the country have been working tirelessly to support pupils.

But steep challenges remain – most children have already lost almost five months of formal schooling and the risk of further lockdowns and disruption still looms large. Disadvantaged children, who were already over a year and a half behind their peers by the end of secondary school, are likely to have suffered the most. Evidence from the Institute for Fiscal Studies finds that they were less likely to have access to digital resources and online teaching during the lockdown. Drawing on this and on existing evidence about the impact of school closures, the Education Endowment Foundation suggests that the disadvantage gap could widen by over a third this year.

We cannot afford to wait until pupils sit formal exams next year to truly understand the impact of lost learning. That is why the Education Policy Institute and Renaissance Learning – the providers of Star Assessments – will be working to analyse data in as close to real time as possible.

The project will analyse large-scale data from Star Assessments across the country – sat by over one million children every year, with tens if not hundreds of thousands in each year group – to better understand pupil attainment this year. By comparing attainment among different groups of pupils to that of their peers at a similar stage in previous years, we will be able to get a picture of Covid-specific disruption, and account for any trends in performance that are not related to disruption caused by the pandemic.

We won’t need to provide additional tests to schools

We will look at data at three points in the year – in autumn, spring and summer – to measure the progress that pupils make this year and how that compares with previous cohorts. And we will supplement this analysis with a survey of Star customer schools to learn more about how they responded to Covid. By gathering information about the approaches schools are taking – and how they are supporting pupils to catch up – we will be building the evidence base in this crucial policy area and learning vital lessons for potential repeated disruption, both from this pandemic and from any other event that interrupts learning.

The advantage of our approach is that we won’t need to provide additional tests to schools, nor will we cause any extra workload for pupils and staff. We will also be able to use the longitudinal data to establish what is really happening, and what is just rumour.

The research will answer questions such as: have particular groups of pupils been disproportionately affected? Are there disparities across different year groups? To what extent are pupils making up for lost time in education? Have regional differences emerged, or differences based on the types of schools which pupils attend? Where future regional lockdowns take place, what is their impact, and which interventions appear most effective? And importantly, what might be the long-term implications of this time out of school and can we see signs of recovery?

We plan to produce results early next academic year – long enough to have seen the impact over a whole year, but soon enough that we can work with the department and school leaders across the country to take the action that is needed.


Renaissance Learning UK and the Education Policy Institute are working together on a DfE-funded project exploring whether and how learning loss due to Covid in English schools is recovered over the academic year