Doubts over the credibility of the baseline assessment chosen by more than 12,000 primary schools could undermine government plans to hold schools to account for pupil progress.
A letter sent to headteachers described a series of “anomalies” in the Early Excellence Baseline Assessment (EExBA), a model that involves observations rather than formal tests.
In the letter, Early Excellence director Liz Marsden said the provider was “acutely aware that there are some anomalies with aspects of the data set”. She said the glitches were mainly in literacy scores.
Speaking to Schools Week, Early Excellence national development manager Jan Dubiel said scores in literacy were quite low, “meaning the threshold of what needs to be ‘typical’ got lower and lower”.
In some cases, children expected to be scored as “below typical”, based on the marks teachers gave in observations, instead received scores labelling them as “typical”.
Primary schools will be measured on the progress pupils make while in their care. If pupils are not accurately assessed during the baseline period, this measure of progress could be distorted.
Early Excellence has conducted an “in-depth analysis” and found that of the 47 statements used by teachers to assess four-year-olds, two in literacy were causing problems: one on the use of initial sounds in words, the other on the naming and sounding of letters.
Mr Dubiel said Early Excellence would provide “more extensive exemplification” and would adjust “some of the wording” in future statements to “make it even clearer of what the expectation of that outcome is”.
But he added that the changes would make this year’s data “slightly different” to next year’s. He said the Department for Education (DfE) would take only one “aggregate score” and that, despite the changes to the literacy model, the overall scores would “still be comparable”.
A primary school headteacher using the EExBA, who wished to remain anonymous, said she now had a “useless piece of data that we won’t be able to use”.
She also said she was concerned about the moderation of the assessments which, if not completed correctly, could mean pupils of equal ability receiving different scores.
“We are relying on teacher assessment and that doesn’t appear to me to have a very thorough national moderation.
“Obviously you moderate within your school and in a cluster, but that doesn’t mean that is going to be the same as the cluster next door. The rigorousness, for something that is going to be used to hold us to account, is not good enough.”
Early Excellence has also been accused of “unprofessional” behaviour after its letter said it provided “the only non-invasive approach to baseline [assessments]”.
Katherine Bailey, director of applied research at the Centre for Evaluation and Monitoring at Durham University, said the language was “very emotive”. The Durham centre is also an approved provider and uses a 20-minute computer program to test pupils.
“I think it is unhelpful to use words such as ‘unethical’ or ‘invasive approaches’ when actually every single one of the suppliers has gone through a rigorous procurement and selection of quality process with the DfE.
“If there was anything unethical or inappropriately invasive in anything we’re doing, it would have been highlighted in that process.”
Mr Dubiel defended Early Excellence’s language.
“I would absolutely stand by it. EExBA is the only assessment that is non-invasive; it doesn’t prescribe any tasks or tests and it is done on practitioner knowledge and observation against the criteria and the exemplification statements.
“The other providers include long periods of time with children and adults sitting outside their classroom doing pre-set tasks and tests. We would say that is invasive in terms of practice whereas ours happens alongside and in conjunction with everyday practice.”
A spokesperson for the DfE said this was an issue for the provider.