
The 6 Key Trends in the KS2 Primary Test Results

See the latest SATs data from 15 December 2016: “Fewer primary schools fall below the floor standards” and “7 key findings from the 2016 primary league tables”.

Original article:

Provisional results of the primary school key stage 2 SATs tests broken down at local authority and regional level have been released by the Department for Education (DfE) this morning.

This is the first year that scaled scores replace “levels”. To meet government expectations, pupils must achieve a scaled score of 100 or above, as opposed to the old expected standard of level 4.

We found out in July that just over half (53 per cent) of year 6 pupils met the new expected standard. The DfE has today confirmed those results and again stressed that this figure is not comparable to last year – when 80 per cent of pupils achieved a level 4 – because of changes in the national curriculum and accountability framework.

Here are the key points from today’s new data.

 

1. Local authority data shows big differences in performance across the country

London continues to dominate the top spots for the proportion of pupils achieving the expected standard of 100 or above in reading, writing and maths this year. The following two tables show the authorities with the highest (left) and lowest (right) scores.

[Tables: Top 2016 | Bottom 2016]

2. Some local authorities’ scores have dropped more than others

Although the results are not directly comparable, we can identify whether authorities have moved up or down under the new scaled scores system. Kensington & Chelsea and Richmond upon Thames kept their top spots, but the proportion of their pupils achieving the expected standard has dropped by more than 20 percentage points under the new system. This is smaller than the national drop of 28 percentage points, which saw the overall proportion of pupils achieving the expected standard fall from 80 per cent to 52 per cent.

Redcar and Cleveland, which had a pass rate of 86 per cent last year, dropped to 59 per cent – falling out of the top 10.

The two tables below show the highest-performing authorities for 2015 (left) and 2016 (right).

[Tables: KS2 LA best 2015 | Top 2016]

Medway, the authority with the lowest pass rate last year, has performed much better under the new system, moving up 32 places, with 48 per cent of pupils now achieving 100 or above.

Dorset, Liverpool, West Sussex, Swindon, Oldham and Stoke-on-Trent, all authorities that did not feature in the lowest scoring table last year, have dropped into the bottom 10 under the new system.

The two tables below show the lowest-performing authorities for 2015 (left) and 2016 (right).

[Tables: worst 2015 | Bottom 2016]

4. Academies performed similarly to local authority schools

There are minimal differences in outcomes between academies and local authority maintained schools, although, overall, local authority maintained schools performed slightly better, by one percentage point across almost all areas. Converter academies – those rated good or outstanding by Ofsted at the time of conversion to academy status – have a higher percentage of pupils achieving the expected standard than the average for all mainstream schools.

 

[Figure: Different school types]

 

5. Girls outperform boys in all areas

In reading, writing and maths combined, 57 per cent of girls achieved the expected standard compared with 50 per cent of boys. The gap is largest in writing.

 

[Figure: Gender]

6. Schools minister Nick Gibb said the results show schools are meeting “high standards”

Schools minister Nick Gibb said that thanks to the country’s new focus on “raising standards”, the majority of pupils have performed “well” in this year’s tests.

He added: “These figures show that many schools and local authorities have risen to the challenge and have delivered high standards, but we want that success to be the standard everywhere. We have made great strides, with over 1.4 million more pupils in good or outstanding schools than in 2010, but the government’s objective is to extend that opportunity so every child has the excellent education they deserve.”

Your thoughts

14 Comments

  1. 1.4 million more pupils in good or outstanding schools. Yes, of course there are more pupils in good or outstanding schools – there are more pupils because the school-age population is growing!
    I wish that, instead of using misleading statistics like this, Nick Gibb would quote the proportion of children in good or outstanding schools, which at least allows something approaching a like-for-like comparison. It should be noted that the majority of additional pupils in good or outstanding schools are in good or outstanding LA-maintained primary schools. If Nick Gibb wants to use these stats to crow about the government’s success in improving schools, perhaps he should give credit where it is actually due.

  2. Is there not an important point here that free schools, recipients of huge quantities of public money and set up with the promise of raising standards, are performing well below LA schools as a block? I can’t see any reference to this in either the article or the minister’s somewhat fatuous statement.

  3. wizzobravo

    Does Nick Gibb have a brain? These results represent an absolute f***ing disaster and he’s trying to spin it to a fantastically hostile and well informed audience.

    An altogether more sensible thing would have been to say, “Hmm, this year’s results represent an exceptional effort on the part of schools to engage with the new curriculum. However, it is now clear that there are issues relating to how appropriate the level of difficulty was for each assessment.”

    We head into the next year facing industrial action over this. I despair.

    Secondaries will be shaking in their boots over these results. Y7 Resits? Linking GCSE results to KS2 performance? We are literally about to screw up our assessment system on a grand scale just because NG is too stiff-necked to engage with school leaders and teachers.

    The DfE has now artificially labelled our primary school system as failing through arbitrary shifts in the difficulty levels of the tests. Madness! Ordinarily, you could not make it up, but apparently someone did. I wonder if they were in receipt of one of David Cameron’s 24% pay rises?

  4. Can anyone tell me what “the expected standard” actually means? In plain English please, not in Gibb-erish. (Serious question)

    Perhaps Schools Week could run a competition to find the best answer.

    Then we could stuff the winner’s medal up Mr Gibb’s fronted adverbial. (not so serious but would get a lot of views on YouTube)

    Please can teachers have back even a modicum of influence over the curriculum. “Pretty please”, Prime Minister Mrs May.

    • The ‘expected standard’ is supposedly ‘100 or above in reading, writing and maths’. But the tests are flawed. It follows that the mandatory ‘expected standard’ will also be flawed.
      Shakespeare would fail the KS2 test. His use of exclamation marks in Macbeth would not be considered ‘creditworthy’. Jane Austen would fail on spelling. She spelled ‘choose’ as ‘chuse’ and ‘stayed’ as ‘staid’ in Pride and Prejudice. Hemingway would undoubtedly fail for not varying sentence length. And Gibb’s ‘wrong’ answer to a grammar question on Today shows how misguided this kind of testing actually is. http://www.localschoolsnetwork.org.uk/2016/06/gibbs-grammar-was-right-but-hes-also-profoundly-wrong

      • But the score of 100 can be anything the minister chooses.

        “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.” “The question is,” said Alice, “whether you can make words mean so many different things.”

  5. The system has changed from being a criterion-based one, focussed on content that any number of children can master, to one based on population norms. Therefore, rather than an 80% pass above level 4 showing that proportion of children gaining a good standard, the national average can only be 50%, represented by the score of 100. The focus has gone from the skills and criteria gained to a comparison between schools and pupils. As we can see, this reflects advantaged children and areas. More worrying is the effect on classroom ethos, as children are increasingly seen to compete directly against each other for their place in the class, and therefore in the social hierarchy and their life chances.
    In my opinion, an 80% pass rate in the former system seems far more motivating and democratic, less elitist, and more conducive to a healthy society.

  6. Sandy Cameron

    Statistical trends are one thing – premature conclusions are another. Unless there is proper scrutiny of LA data, it remains nothing more than a raw league table, about as acceptable as the performance tables used for schools.

    For example, why should higher or lower writing results be down to harsher or more lenient moderation? Doesn’t it rather depend on the results submitted for moderation in the first place? And also on the size of the sample moderated. The minimum is 25%, but are there some LAs that moderate a much larger sample?

    Whatever is going on in local authorities, it won’t be properly understood unless someone does some proper research. In any case, grouping schools by local authority might be regarded as little more than an arbitrary grouping based on political geography. If schools are already largely autonomous institutions, then the LA in which they reside should make little difference…unless the small matter of variable LA formulas is taken into account, of course!