Exams

OCR to review difficulty of GCSE computer science paper

Follows backlash over exam leaving pupils 'completely disheartened' and warning it will put more youngsters off the subject

31 May 2023, 12:02

An exam board has said it will “look at the difficulty” of a GCSE computer science paper after complaints that it left high-achieving pupils “traumatised” and “completely disheartened”.

While OCR has pledged a “fair outcome for all” with the potential to lower grade boundaries if deemed necessary, teachers and experts fear it could put children off taking the subject.

Pupils typically achieve a grade lower in computing compared to other comparable GCSEs, analysis by BCS, The Chartered Institute for IT, suggests.

The issue of whether computing is harder is also already being investigated by regulator Ofqual.

Teachers and students took to social media to complain about the J277 GCSE (9-1) paper 2 on Thursday afternoon.

Kaajal Chohan, head of computing at a Leicester school, tweeted the paper was “awful” with pupils left “completely disheartened”.

“The level of difficulty was unnecessary and nothing like the previous papers. It has put so many students off choosing this as an A Level subject.”

Another computer science teacher said the paper had left her students “traumatised”.

An OCR spokesperson said it was aware “some students found this to be a challenging paper”.

“We will look at the difficulty of the paper during the marking and awarding process,” they added.

“As our senior examiners review student answers on this exam paper, their options will include adjusting mark schemes and setting grade boundaries at an appropriate level to ensure a fair outcome for all.”

Students ‘very upset’ after exam

Specific concerns raised by teachers included that the exam contained “too many wordy questions” and that the paper was “a huge step up” from previous ones.

Complaints have also been shared via a closed Facebook group for the subject, which is administered by OCR employees.

In one comment, a teacher said “students who were getting grade 9s in mocks were very upset with it.

“One of my students who is really advanced for his level…came up to me and said that he only just about understood it.”

One computer science teacher based in East Sussex told Schools Week she was considering switching to a different exam board for new GCSE pupils because of the difficulty of the paper.

Pupils also voiced their frustration, with one taking to social media to declare it “the worst exam ive [sic] ever sat in my LIFE”.

Several posts about the paper also surfaced on video-sharing platform TikTok, including one of a pupil with his head in his hands in a school yard alongside the caption “wtf was that”.

‘Struggle’ to get pupils to take computer science

Miles Berry, professor of computer education at the University of Roehampton, said the subject was difficult to assess in general because written exams offered “a very unrepresentative experience of problem solving using programming”.

“I think it would be lovely to get back to something which gives a more realistic experience of programming,” he added.

He said as the subject is “already known” to be more difficult, it wouldn’t put future pupils off.

But he added “it’s possible that we’ll have had some [pupils] who had such a negative experience of this paper that they decide not to carry on with the A-level”.

The number of pupils taking GCSE computing has increased over the past six years by more than 20 per cent, compared to a rise of just five per cent across all GCSEs. However, momentum has stalled since 2019, Datalab figures show.

At A-level, the number of pupils in England taking the subject has risen by more than 80 per cent, reaching 14,800 last year.

The National Centre for Computing Education, backed by £84 million in government funding, was set up in 2018 to improve teaching and drive up participation in the subject.

BCS, The Chartered Institute for IT, said it had “picked up some concerns” about the paper’s level of difficulty from the subject association group Computing at School.

But its head of education, Niel McLean, said it needed “to see how this translates into the grades awarded”.

BCS has previously raised concerns with Ofqual about the difficulties in attaining a high grade in the subject at GCSE.

In a 2021 letter to the regulator, BCS said this “leaves computing teachers with a huge struggle to persuade their senior leaders to resource GCSE Computer Science, and their students to take it”.

In a response in December that year, chief regulator Jo Saxton said the analysis “raises some questions that I find warrant further investigation”.

She added that developing a suitably rigorous evidence base would take time given the absence of exams in 2020 and 2021.

Ofqual is currently “considering” the concerns raised, including reviewing standards in GCSE computer science.

An Ofqual spokesperson said it was “aware” of concerns raised about the paper.

“As in any year, grade boundaries for every specification will be set by senior examiners after they have reviewed the work produced by students in the assessments. This will take into account how difficult each specific paper was,” they added.

“The difficulty of exam papers varies each year and Ofqual regulates exam boards to secure that qualifications are awarded fairly and consistently.” 

Your thoughts

  1. I don’t have a paper to hand for details, but I was shown a mistake in an array question.

    The array name used in the question had an extra word at the front compared with the array title shown below it.
    This led to doubt about whether a table was missing and whether there were two arrays with similar but different titles.

  2. Anonymous

    My daughter took computer science as a GCSE and wishes she had never taken the subject, as the curriculum is so difficult, let alone the exam.

    • anonymous

      I agree this paper was a disgrace. I am currently in Year 11 and have just sat this paper, and I hated every question of this flipping paper. MADE NO SENSE.
      Fire MR/MRS OCR.

  3. Anonymous

    As a student who has just taken the papers, I have mixed feelings. Initially I found paper 2 (programming) rather easy, but on review I realised that although all the questions were on the syllabus, they were really difficult. I have four years of coding experience and there were still times I questioned myself, which made me realise how deeply students who only had the required GCSE-level understanding of programming would have struggled on the later questions. There is nothing they can do now to change the paper, but when marking it would be smart to award marks for the intent of the code rather than its full correctness, in order to assess students’ analytical ability.

  4. Mrs Kendrick

    I am very disappointed in OCR over the content of the Computer Science paper 2. Firstly, many of my students were very upset during and after sitting the exam because it did not resemble past papers or the specification. My students have sat many past papers as part of the revision process and, according to the mark scheme, performed well. The main issue I have with the May 2023 paper is Section B, which carried 30 marks.

    The context of the section was control technology, which is not part of the specification, and I believe it was this that initially ‘confused’ students. They had also sat a Maths exam that morning, so they were understandably tired at this point.

    Here are the issues I have with the actual questions:

    Question 6f (i) – The table uses the variable identifier “arrayEvents” and then refers to the same variable as “events”. I found myself unclear on how to proceed; at best, the question was ambiguous.
    Question 6f (ii) asked students to write a program in a high-level programming language they have studied (Python in our case). However, the program they had to write stores data in a 2D array, and 2D arrays are not native in Python; this would have confused students and made it difficult to write the program confidently (a brief sketch of the 2D-array point appears after the comments).

    Many of the actual questions were overly long and complex, requiring students to re-read them to truly understand what was being asked.

    I gave this paper (section B) to my A Level Computer Science students. For the first five minutes they did attempt the paper on their own, but then said they needed to clarify points, so it ended up as a discussion about what the questions were actually asking. They felt there were so many different tables, and often they were not clear on which were the rows and which were the columns, as the formats seemed to change frequently. They also had to keep going back to previous points/pages/tables to retain information in their heads, which they found frustrating. They agreed that it was nothing like last year’s (or previous years’) papers and were grateful that they did not have to sit this one.

    I have felt for a while that OCR asks students to write algorithms/programs independently, whereas in the real world these are produced through group work and co-operative learning. Our school is inclusive, so we have students on varying pathways; this paper, I felt, was totally exclusive to the higher mark bands, mainly due to the wording of the questions, which required a high level of literacy to understand. We are now seriously considering looking at alternative courses, because many of our Year 11s are thinking of dropping A Level Computer Science as a consequence of this paper.

    I do not believe that OCR is in the business of setting students up to fail, but in all my years of teaching, I have never seen a response like this from students, and based on the paper, there is credence to their claims.

  5. It is results day and the grade boundaries have been released.
    The OCR Computer Science grade boundaries have INCREASED compared to last year.
    They apparently failed to take into account the huge factor of paper 2 being A Level standard.
    While my underachievement in the subject isn’t substantial (grade 7 when I have always got 9s), these grade boundaries have prevented others from passing the subject.
    OCR will get away with ruining another year 🙂

  6. Samueli

    My daughter has been getting grades 8 and 9 in computer science. She said it was the hardest exam she had ever taken. It also had mistakes in it. She failed today – getting a very unexpected grade 3. A young man in her class who was expected to fly in this subject also failed, getting an extremely shocking and devastating 3. This computer science paper has to be looked into ASAP.
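
A note on the 2D-array point raised in Mrs Kendrick’s comment above: Python has no built-in 2D array type, so GCSE candidates are usually taught to emulate one with a list of lists. The minimal sketch below is illustrative only; the identifier arrayEvents is borrowed from the comments, while the rows, the tickets-sold column and the total_tickets helper are hypothetical and are not taken from the actual exam question.

    # A 2D "array" in Python is usually represented as a list of lists,
    # because the language has no native 2D array type.
    # The data below is hypothetical, not taken from the exam paper.
    arrayEvents = [
        ["Quiz night", 40],    # each inner list is one row: [event name, tickets sold]
        ["Sports day", 120],
        ["School play", 85],
    ]

    def total_tickets(events):
        # Sum the second column (index 1) of every row.
        total = 0
        for row in events:
            total = total + row[1]
        return total

    print(total_tickets(arrayEvents))  # prints 245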