A school has been rapped by the information watchdog after it illegally used facial recognition on its pupils, prompting a call for leaders to safeguard children’s rights when implementing new technology.
The Information Commissioner’s Office has issued a formal reprimand to Chelmer Valley High School in Essex following “infringements” of the UK’s general data protection regulation (GDPR) relating to the processing of biometric data.
The school failed to assess the impact of the technology used to manage its cashless catering system before rolling it out, and did not obtain explicit consent from its pupils, the ICO ruled.
Lynne Currie, ICO head of privacy innovation, said: “Handling people’s information correctly in a school canteen environment is as important as the handling of the food itself.
“We expect all organisations to carry out the necessary assessments when deploying a new technology to mitigate any data protection risks and ensure their compliance with data protection laws.”
She said introducing measures such as facial recognition technology “should not be taken lightly, particularly when it involves children”.
“We don’t want this to deter other schools from embracing new technologies. But this must be done correctly with data protection at the forefront, championing trust, protecting children’s privacy and safeguarding their rights.”
‘Facial recognition has no place in schools’
But Jen Persson, director of data rights campaign organisation Defend Digital Me, said facial recognition “has no place in schools”.
“It’s high-tech with associated risks for life, and it’s not right to be using it to buy a lunch.”
Under GDPR rules, when a new way of processing personal data is deemed a “high risk” to rights and freedoms, data controllers, in this case the school, must assess its impact on the protection of personal data before the processing begins.
They also have to get consent from those whose data is being used.
The 1,200-pupil academy in Chelmsford introduced facial recognition technology in March 2023. It replaced fingerprint recognition technology used since 2016 to manage catering payments.
The facial recognition technology was provided by a firm called CRB Cunninghams.
The school’s data protection officer contacted the ICO in January this year and provided a data protection impact assessment (DPIA) completed last November. They “considered the processing to be high risk, and submitted the DPIA for review”.
But they also confirmed “that no DPIA had been completed for the introduction of facial recognition technology prior to the processing commencing in March 2023”.
School had relied on assumed consent
Through further correspondence, the ICO “established that from March to November 2023 the controller had been relying on assumed consent for facial recognition, except where parents or carers had opted children out of the processing”.
GDPR rules are “clear that consent requires an affirmative action, and as such consent on an opt-out basis would not have been valid or lawful”.
“Further to this, the majority of students would have been considered sufficiently competent to provide their own consent. The parental opt-out deprived students of the ability to exercise their rights and freedoms in relation to the processing between March and November 2023.”
The school also “failed to seek advice from their DPO in relation to the introduction of the facial recognition technology, nor did they consult with parents or students before commencing with the processing”.
The ICO “believes that had Chelmer Valley High School sought advice from their DPO, many of the compliance issues would have been identified prior to the processing commencing”.
“Chelmer Valley High School has therefore failed to complete a DPIA where they were legally required to do so.
“This failing meant that no prior assessment was made of the risks to data subjects, no consideration was given to lawfully managing consent, and students at the school were then left unable to properly exercise their rights and freedoms.”
‘Engage more closely’ on data protection
The school has since “refreshed” consents by obtaining “explicit opt-in consent from students”.
The ICO made a series of recommendations, but the school is not under a legal obligation to follow them.
It told the school to amend its impact assessment to “give thorough consideration to the necessity and proportionality of cashless catering, and to mitigating specific, additional risks such as bias and discrimination”.
The school should also “review and follow all ICO guidance for schools considering whether to use facial recognition for cashless catering”.
And it should “engage more closely and in a timely fashion with their data protection officer when considering new projects or operations processing personal data, and document their advice and any changes to the processing that are made as a result”.
Schools lack training and capacity
The use or proposed use of facial recognition technology in schools has prompted widespread criticism in the past.
In 2021, Schools Week reported how at least two schools in England were reversing their plans to install facial recognition systems in their canteens after a Scottish council faced widespread criticism for its use of the technology.
North Ayrshire Council subsequently put its use of the software on hold.
Persson added that policymakers “need to stop passing the buck here”.
“Schools do not have the training or capacity to procure technology in ways that respect the law or children’s rights.
“We need a national office for oversight, quality controls on procurement access to children via the public sector, and to get a grip on what this costs the state now and for our future society.”
The school was approached for comment.