
Pearson apologises after BTEC gaffe hits schools’ progress 8 scores

Exclusive

Ministers have vowed to amend schools’ progress 8 scores after it was discovered some were as much as 0.2 points lower than expected because Pearson sent incomplete BTEC results to the government.

The exam board has apologised after some schools checked Department for Education data on Thursday and were shocked to see that their progress 8 scores were much lower than anticipated.

Pearson says the error has been corrected at its end, and the DfE says its data will be amended before it is released to the public later this year.

“We apologise to customers who have been checking their Progress 8 data sets,” a Pearson spokesperson said.

“As part of this process, some schools have highlighted that data sets were missing within a specific date range. We have followed up on this and corrected the issue with the data provider. We’d like to thank those that have contacted us for drawing this to our attention.”

It is the second gaffe relating to BTECs to hit the exam board this year.

In August, Pearson was forced to apologise after it hiked grade boundaries for its BTEC Tech Awards just days before pupils were due to collect their results, meaning youngsters were handed lower grades than they were expecting.

The latest error was discovered by data managers and heads when they began the annual performance tables checking exercise, which allows schools to log on to a government website to see their national data before it is released to the public. The data went live yesterday morning.

Peter Atherton, data manager at a school in Wakefield, told Schools Week some schools had received a “nasty surprise” when they went to check the website.

“It could be the case that, if all of these qualifications were missing for your school, that could affect your progress 8 score by quite a lot. Some schools are saying they’re 0.2 below what they were expecting.

“If you were relying on those results to fill your open element of the progress 8 calculation and they’re missing, it will just assume you haven’t got anything in there and give you no points for those slots.”
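
For context on the mechanism Atherton describes: under the DfE’s published methodology, Attainment 8 is built from ten slots per pupil (English and maths double-weighted, three EBacc slots and three “open” slots), and Progress 8 is the gap between a pupil’s actual and estimated Attainment 8, divided by ten. The sketch below is illustrative only, with hypothetical point scores and a hypothetical prior-attainment estimate:

```python
# Illustrative sketch of the "empty open slot" effect described above.
# Point scores and the prior-attainment estimate are hypothetical; the
# real calculation uses DfE point scores per qualification and per-pupil
# estimates derived from KS2 prior attainment.

def attainment8(english, maths, ebacc, open_slots):
    """Attainment 8: English and maths double-weighted, plus the best
    three EBacc and best three open-element point scores."""
    return (2 * english + 2 * maths
            + sum(sorted(ebacc, reverse=True)[:3])
            + sum(sorted(open_slots, reverse=True)[:3]))

def progress8(a8_actual, a8_estimate):
    # Progress 8: the gap between actual and estimated Attainment 8,
    # expressed per slot (hence the division by ten).
    return (a8_actual - a8_estimate) / 10

# A pupil whose BTEC (worth, say, 5 points) fills one open slot...
with_btec = attainment8(6, 6, [5, 5, 4], [5, 4, 4])      # 51 points
# ...and the same pupil after the result goes missing: the slot scores zero.
without_btec = attainment8(6, 6, [5, 5, 4], [0, 4, 4])   # 46 points

estimate = 51.0  # hypothetical estimated Attainment 8
print(progress8(with_btec, estimate))     #  0.0
print(progress8(without_btec, estimate))  # -0.5
```

Averaged across a cohort in which only some pupils carried affected BTECs, a per-pupil fall of that size would be consistent with the school-level drops of around 0.2 that schools reported.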

Duncan Baldwin, deputy policy director at school leaders’ union ASCL, said he was seeking assurances from the DfE that the department itself and not schools would be responsible for amending the data.

“When qualifications are omitted from this data-checking exercise, the purpose of the exercise is that schools can add them back on just in case.

“But when there are huge swathes of results missing, clearly we don’t want to impose workload on schools, so we need to get some reassurances from the DfE that they will do the work.”

Baldwin said he had also approached Ofsted to ask if it would continue to accept schools’ own calculated data, which it has been accepting up until today.

“What Ofsted have said is that after today, that calculated data can’t be used, but ironically, because of the errors in the data that’s come from the DfE, that calculated data is likely to be more accurate than the DfE’s data.”

A Department for Education spokesperson said: “We are working with our supplier to provide a solution and will update the checking exercise website as soon as possible.”

Your thoughts


4 Comments

  1. Chris Challis

    My trust’s schools have had this problem today. When phoning the helpline it was obvious a serious error had happened, but we were told the data would be published. Why in education, especially at the DfE, doesn’t common sense prevail? You cannot publish data that is known to have serious omissions. Apologise, rectify the issue and then publish. Surely that is the only way forward. This could seriously impact on individual schools at a time when Y6 parents make their choices.

  2. We also rang the line and asked if they knew there was a problem. I was told this happens every year and is what the data-checking exercise is for. I explained that I could understand that if there were a few discrepancies at school level, but not if there is a nationwide issue. I also said I thought it was disgraceful that there had been no communication. The schools I run have mainly benefited from this, but there will be headteachers around the country with their heads in their hands wondering what they are going to do. Their reply? Check the data or it won’t count, and we’ll pass your concerns on.

  3. On the phone it sounded like no one had the slightest idea of the scale of the issue. Interesting that no one had noticed the sudden nosedive in the nation’s results until schools pointed it out. Mistakes do happen, but really? They had one job… It will be good to see Pearson waive late fees for schools who forget future deadlines.

  4. And, a week later, still no updated summary figures. The new figures are in but the headlines are not updated. In that time, many schools will have had inspections, with senior teams left wondering whether and how to update SEFs and governors losing confidence in school data teams (despite the fact it is not their fault). It strikes me that nobody at Capita (to whom this exercise has been outsourced) really appreciates the impact this has had and continues to have.

    Put simply: this government created a headline measure that, until recently, has pretty much decided the outcome of inspection results, the perception of the public towards a school or academy, and shaped the futures and careers of school leaders. When it is messed up, it matters.

    If this is unpalatable, don’t create such high-stakes single metrics at all. After all, we all know education is more nuanced than a flawed progress figure or a 1 to 4 Ofsted grade; both are crass and crude. But if you insist on creating such measures, jolly well get them judged and calculated correctly, checked and on time.