ChatGPT: Exam boards publish AI guidance for schools

JCQ tells schools to make pupils do some coursework 'under direct supervision' amid cheating fears

Schools should make students do some coursework in class “under direct supervision” to make sure they are not cheating amid fears about artificial intelligence (AI) such as ChatGPT, new exam board guidance states.

The Joint Council for Qualifications (JCQ) – which represents boards – has published guidance for schools today on “protecting the integrity of qualifications”.

While the majority of qualifications are exam-based and unaffected by AI, some assessments, such as coursework, allow access to the internet.

It follows reports of schools scrapping homework over fears of cheating, as top universities ban the use of AI in coursework and exams.

Here’s what schools need to know…

1. Misuse of AI is malpractice

JCQ said chatbots may pose “significant risks” if used by students completing assessments. They can often produce incorrect answers, biased information or fake references, the guidance reads. 

Students who misuse AI – where the work is not their own – will have committed malpractice and may attract “severe sanctions”. Any use of AI which means students have not “independently demonstrated their own attainment” is likely to be considered malpractice. 

Sanctions for “making a false declaration of authenticity” and “plagiarism” include disqualification and being barred from taking qualifications.

Schools’ policies should address “the risks associated with AI misuse” and staff should communicate the importance of independent work to students.

2. …but AI tools can be used

The exam boards said AI tools must only be used when the conditions of the assessment permit the use of the internet and where students are able to demonstrate the final submission is their “own independent work and independent thinking”.

Students must appropriately reference where they have used AI. For instance, if they use AI to find sources of content, the sources must be verified by students and referenced.

So that teachers can check whether AI use was appropriate, students must “acknowledge its use and show clearly how they have used it”.

Students must keep a copy of the questions and the AI’s answers for reference and authentication purposes. The copy must be in a non-editable format – such as a screenshot – and submitted with the work, alongside a brief explanation of how the AI was used.

3. Consider supervised work and restricting AI in schools

JCQ has set out a list of actions that schools should take to prevent misuse – many of which are “already in place in centres and are not new requirements”, it added.

Actions include considering whether students should sign a declaration confirming they understand what AI misuse is.

Schools should consider restricting access to online AI tools on their devices and networks, including those used in exams. 

“Where appropriate”, schools should be “allocating time for sufficient portions of work to be done in class under direct supervision to allow the teacher to authenticate each student’s whole work with confidence”.

This is similar to what Ofqual boss Dr Jo Saxton suggested earlier this month.

Schools should consider whether it’s “appropriate and helpful” to have a “short verbal discussion” with students about their work to confirm “they understand it and that it reflects their own independent work”.

Teachers should also examine “intermediate stages” in the production of work to make sure their final submission “represents a natural continuation of earlier stages”. 

4. Look out for typed work and hyperbolic language

JCQ says identifying AI misuse requires the “same skills and observation techniques” teachers already use to check students’ work is their own, such as comparing it against their previous work to check for unusual changes.

Potential indicators of AI include default use of American spellings as well as vocabulary which might not be appropriate for the qualification level. 

Another is a student handing in typed work when their usual output is handwritten. Staff should also keep an eye out for “overly verbose or hyperbolic language” that may not be in keeping with a student’s usual style.

JCQ points to several services – such as GPTZero and OpenAI Classifier – which can determine the likelihood text was produced by AI. 

5. ‘Detected or suspected’ misuse should be reported

If a teacher’s suspicions are confirmed and the student has not signed the declaration of authentication, a school does not need to report malpractice to the exam board. The matter can be resolved before any declaration is signed.

But if this has been signed and AI misuse is “detected or suspected” by the school, the case must be reported to the relevant exam board. 

If misuse is suspected by an exam board marker, or it has been reported, full details will usually be relayed to the school. The board will then consider the case and “if necessary” impose a sanction.

Staff should not accept – without further investigation – work they suspect has been taken from AI tools, as this could encourage the spread of the practice. It could also constitute staff malpractice and attract sanctions.
