Every primary school publishes a pupil premium strategy. Inspectors read them. Governors scrutinise them. Leaders devote hours to drafting and updating them.
But are these documents genuinely helping schools make better decisions for disadvantaged pupils, or are they a compliance exercise?
A regional review of more than 550 primary school pupil premium statements across the north east suggests the answer is not straightforward.
The positive finding is that schools are actively engaging with evidence when designing their strategies. Less encouraging, however, is that some of the approaches with the strongest evidence for improving attainment and narrowing disadvantage gaps remain underused.
The review, funded by the North East Combined Authority and conducted by WhatWorked Education, examined every published primary pupil premium strategy across the region for 2024-25.
At first glance, the findings are encouraging.
Schools ‘not ignoring evidence base’
Ninety-four per cent of statements referenced the Education Endowment Foundation’s (EEF) teaching and learning toolkit.
Very few schools were spending on approaches with weak or unclear evidence. This matters because it suggests school leaders are not ignoring the evidence base.
But across the region, schools typically referred to just two of the EEF’s five highest-impact, low-cost strategies.
Oral language and reading comprehension approaches featured strongly, reflecting the centrality of language development in the primary years.
Feedback, however, appeared in fewer than half of strategies. Metacognition and self-regulation featured in just over a quarter. Peer tutoring appeared in only two per cent of statements.
These are well-established approaches with extensive supporting evidence. Their absence raises an important question for school leaders: why do some of the most effective strategies struggle to move from research summaries into classroom practice?
Approaches such as metacognition and peer tutoring are not “bolt-on” interventions. They require careful design, staff training and sustained attention to implementation.
Toolkit not meaningfully engaged with
In busy schools under constant pressure, leaders may understandably gravitate towards approaches that feel familiar or are easier to explain to multiple audiences.
Another explanation lies in how pupil premium statements themselves are being used. In many weaker examples, the toolkit was referenced but not meaningfully engaged with.
Strategies were listed, but without a clear explanation of why they were chosen, how they would be implemented or how leaders would know whether they were working.
In these cases, the document became descriptive rather than strategic, a catalogue of activity rather than a plan for improvement.
The strongest statements looked very different. They began with careful diagnosis, often drawing on standardised assessment data in English and maths alongside attendance and pastoral information.
They selected a small number of approaches tightly aligned to identified needs. Crucially, they explained what staff would do differently in classrooms, how implementation would be supported, and when leaders would review impact and make any necessary adaptations.
This distinction matters because pupil premium strategies now sit at the intersection of accountability, improvement and inclusion.
System-level lessons
Ofsted inspectors use them to understand how leaders think about disadvantage. Governors rely on them to hold schools to account. Staff look to them for clarity on priorities. When strategies are vague or generic, they fail all three audiences.
There are also lessons at system level.
Patterns of over- and under-use can tell us where professional development is needed. Gaps between evidence and practice can highlight where implementation support matters more than new guidance.
The question, then, is not whether schools are using the toolkit. It is whether they are being supported to use evidence well.
Which high-impact approaches are hardest to implement in real classrooms? Where do leaders need practical support to translate research into classroom routines? And how can schools review impact and adapt effectively?
The lesson from this analysis is not that schools are getting pupil premium “wrong”. It is that evidence use is no longer the main challenge. Implementation is.
If pupil premium funding is to deliver on its promise, strategies must move beyond referencing evidence towards embedding it through sharper diagnosis, clearer planning and stronger review.
Publishing a strategy is easy. Making it do real work for disadvantaged pupils is much harder.