Dismissing the educational potential of AI is wasted energy. Worse, it could leave the sector exposed, writes Priya Lakhani
There’s an idea in psychology called the ‘thought-terminating cliché’. Popularised by psychiatrists analysing the effects of political propaganda, the term describes the deliberate use of loaded, emotive language to quell critical thinking. Trite sayings or proverbs are used to end a discussion without actually addressing the argument at hand.
It’s a rational conflict avoidance tool, and you can find thought-terminating clichés everywhere, often used benignly. “It is what it is.” “That’s just how it’s done here.” “I’m just saying.” And now, the thought-terminating cliché is thriving in the world of technology.
Take, for example, “AI is just a marketing gimmick”. Okay. I’m as baffled as anyone by AI-powered cat flaps. And okay, a study of almost 3,000 startups found that 40 per cent of European AI startups don’t actually use AI in any meaningful way. There is valid criticism to be made of companies that make false claims about their technological prowess, thereby undermining confidence in genuine innovations.
But that shouldn’t lead us into generalisations about the technology itself. Accusations of gimmickry are too often made without evidence, and increasingly so in the world of education technology. Claiming that “AI is just a marketing gimmick” has itself become a marketing gimmick, used both as a fig leaf by less technologically advanced edtech innovators and as a thought-terminating cliché by ideological technophobes.
Developing an AI is no mean feat – it takes years, millions of pounds and a talented team of engineers (and in the case of CENTURY, teachers and neuroscientists too). Genuine AI products use a range of machine learning algorithms and systems that are fed masses of data to enable them to make decisions and learning recommendations.
Forming a forceful argument against an advanced technology that is helping teachers and students to thrive in schools as diverse as Eton College, Michaela Community School and schools educating Syrian refugees in the Middle East requires a lot more cognitive work than reaching for your trusty cliché.
And yet there are plenty of legitimate criticisms to be made of education technology more generally. That classroom cupboards often resemble a scrapyard of failed technologies is testament to that. But while we should handle innovation with prudence, we shouldn’t let past mistakes restrict present and future successes.
Teaching will always be a human-led sector. To the disappointment of nobody but the most zealous techno-evangelists, robots will never take over the classroom. In fact, they barely exist. Instead, AI’s role is twofold: giving each child an education tailored to their needs and liberating teachers from the administrative work that causes so many to burn out.
Reformers should be setting out to ensure these aims are met ethically. Unlike the relics clogging up classroom cupboards, AI has the potential to be more transformative for education than any previous technology. But making good on that promise will require us to leave behind the lazy cliché of gimmickry.
The facts are that AI’s prominence across all facets of our lives is steadily increasing, that it is forecast to surge over the next few decades, and that Britain’s AI sector is currently the third largest in the world. Education – our second biggest economic sector – will only increasingly come under its influence.
And dismissing it as gimmickry actually prevents us from focusing on the real and potentially serious ethical challenges it presents, such as data privacy and algorithmic bias. Should these critics succeed, an educational and economic powerhouse would be lost. But they are unlikely to succeed, and when they fail, these ethical questions will still need solving.
In my new role as a non-executive board member at the Department for Digital, Culture, Media & Sport, I’m hoping to help policymakers ensure that all sectors – including education – benefit from these technologies ethically.
Meanwhile, a framework thankfully already exists to support leaders and teachers to deploy them safely in schools and classrooms. But doing so requires careful attention from all stakeholders. And that has to start with dropping the clichés.