ChatGPT: How to guard against AI-generated essays

We should worry less about students using AI to cheat the system and more about a system that can be cheated by AI, writes Yvonne Williams

7 Jan 2023, 5:00

If we teach students to write like bots, can we be surprised that bots can write like students? As we start a new year, the media has been crowded with articles about the new powers of artificial intelligence (AI). Some envisage that very soon teachers will be unable to distinguish between authentic and AI-generated essays.

While we have not yet reached that point, there are three reasons why we should take this possibility seriously. First, it is a question of values. Students should understand why this kind of intellectual theft is wrong. Schools should make this plain in their codes of conduct and enact the consequences should students choose to cheat.

Second, it brings us closer to the perennial question of what education is and should be for. The 2015 “reformed qualifications” have brought with them an increasing amount of teaching to the test because the product (the qualification) has become more important than the process (the development of the intellect).

The combination of high-stakes exams and predictability of question types makes our system more vulnerable to students successfully substituting AI for authentic writing. Awarding bodies and Ofqual must share the blame: every new specification comes with more detailed guidance.

In the quest for ever-better results, teachers are encouraged to make tasks predictable. Acronym-based planning (AFOREST anyone?), over-emphasis on paragraph structures (PEE, FART, etc.) and WAGOLLs ensure that students’ essays become the products of an educational assembly line. And the more predictable the expected outcome, the easier it is for an AI essay-creator to accommodate it.

Third, and arguably most importantly, is that if students are to benefit fully from their education, they need to develop their own style. Making students more self-reliant prepares them to compete in the world of work, where original and workable ideas, inventions, and enquiry are valued.

So how should schools respond?

First and foremost, staff should be trained and kept updated on what AI products can actually produce, and the distinguishing features of their output.

Some will argue that the most obvious remedy is to take a low-trust, high-control approach whereby all essays become controlled assignments and all preparation is closely supervised. But low-trust strategies do not teach students to take moral or educational ownership of their work for its own sake. And is a treadmill of in-class assessment the best way to organise a learning journey?

It would be better to make more intensive use of teachers’ knowledge of assessment and their students’ capabilities. Teachers know only too well the laboured style of sentences and paragraphs “lifted” from study guides. They can detect the more mature style that usually indicates a parent’s involvement. And they can spot any deviations from their normal output: handwriting, style, pattern of technical errors and even their willingness to revise a draft. In classroom discussion, students’ idiosyncrasies and attitudes emerge that shape their written work.

To steer clear of answers that are easily replicated by AI, we should encourage our classes to experiment with different structures and give them the space to try, fail and rebuild. Using classroom time for students’ drafting and revising allows teachers to see what might be expected from each one, then compare it with what is handed in.

Finally, school leaders and awarding bodies must support teachers in holding the line against challenges from more assertive students and parents; otherwise, inauthentic work will slip through. Where there is doubt about authenticity, for example, interviews with students about their essays could challenge those whose submissions are out of keeping with what they have produced before and add depth to assessment more generally.

One thing is certain: purchasing the software that creators of AI are developing to detect the inauthentic essays they have programmed their bots to manufacture in the first place is not the solution. Instead, trust, agency, flexibility, integrity and accountability are how we will keep assessment humanly intelligent.
