The proportion of pupils achieving expected standards in science has dropped by more than 20 percentage points since the national curriculum was updated, government figures showed last week.

A report by the Standards and Testing Agency (STA) published on its website last week showed that the proportion of pupils achieving level 4 or above in science has dropped from 84 per cent in 2012 to 63 per cent in 2014.

The proportion achieving a level 5 has dropped even further – from 36 per cent to 11 per cent.

The STA claims that a new methodology for sampling pupil achievements has caused the decline, but teachers and assessment experts believe it shows that teachers are spending less time on the subject.

Science SAT tests for all 11-year-olds were last taken in 2009; from 2010 onwards only a sample of pupils from a range of schools across the country took them. Between 2010 and 2012, 750 schools were told in February that their year 6 pupils would sit a science test during SATs week in May, and schools were given pupils' results. There were no tests in 2013.

In 2014, after the introduction of the new national curriculum, a new sampling methodology was introduced. That year, just five pupils from each of 1,900 schools were selected for testing, using a new selection of questions. The tests were taken at a different time from SATs, and schools no longer received individual pupils’ results.

In its report, the STA argues that these changes were likely to have had “some impact on school behaviour and pupil motivation”, meaning that pupils in 2012 were likely to be “test ready” whereas those in 2014 were not.

But Anne Goldsworthy, a former primary teacher and science specialist, said she believed the results reflected a downgrading in the importance of science teaching since the removal of externally marked SATs: “As schools are no longer accountable for results, [science] has slipped out of focus.”

Science attainment in primary schools is now assessed by teachers, whose judgments contrast sharply with the sample test results: they found 88 per cent of children achieving level 4 or above in 2014, and 89 per cent last year.

Jane Turner, director of the Primary Science Quality Mark, agreed that there was “less science teaching going on since it was removed from the accountability measures” but also said the “test results cannot be used to compare to previous years due to the differences in methodology”.

The STA said in the release that results from primary maths tests were better indicators of performance in GCSE science than primary science tests.

The results were also published more than a year later than promised, which a Department for Education (DfE) spokesperson said was because a “special analyst” was needed.

She added: “As part of our mission to raise the bar, we have made changes to KS2 science sampling tests, to bring these in line with international benchmarks and to provide a robust measure of national standards over time, helping us to deliver our commitment to make Britain the best place in the world to study maths, science and engineering.”

The findings raise awkward questions for the government’s other assessment initiatives. The national reference test, which will sample a range of GCSE pupils shortly before their exams, will be piloted by exams regulator Ofqual this March. Pupils’ scores are intended to be used to set grade boundaries.

School leaders have raised concerns with Schools Week that if pupils score poorly on the tests because they are not “test ready”, this could negatively affect the grades available for the entire country.

Ofqual has said that it is only “trialling” the approach and the results will only provide “an additional source of information” when making decisions about grades.