A school in the UK has been formally reprimanded by the country's data protection regulator for using facial recognition technology without first obtaining explicit opt-in consent from students to process their face scans.
The news rekindles an ongoing privacy debate over the use of biometric data in school settings, particularly where children are involved. Last year, New York became the first US state to ban facial recognition in schools after AI companies stepped up their marketing efforts promising to make schools safer.
Fingerprint technology has been used in UK schools for a variety of identity and authentication purposes for many years, but facial recognition is increasingly being used as well, a trend accelerated by the pandemic's push for fully contactless payments. Some schools have been using facial recognition software to manage meal payments for at least four years, prompting the UK's Information Commissioner's Office (ICO) to take action after the technology began to be used in a number of Scottish schools in 2021.
Nearly three years later, the ICO was forced to act again. Chelmer Valley High School in Chelmsford, Essex, had been using fingerprint authentication since 2016, but from March 2023 it began using facial recognition technology for cashless lunch payments. The facial recognition system was provided by a company called CRB Cunninghams.
Schools are permitted to use facial recognition technology but must first carry out a Data Protection Impact Assessment (DPIA), which the ICO said Chelmer Valley High School had not done before introducing the new biometric technology. The school submitted its DPIA to the ICO in January this year, almost a year after introducing the technology.
Furthermore, the ICO said the school did not have “express permission” to process pupils' facial scans. Although the school had sent letters to parents informing them that the technology would be used, it presented the scheme as opt-out, meaning that pupils who did not return a form explicitly stating they did not want to take part were automatically enrolled. This goes against Article 4(11) of the UK GDPR, which states that consent requires “clear affirmative action”.
“The facial photos taken by these systems contain highly sensitive biometric data,” Mark Johnson, advocacy manager at privacy campaign group Big Brother Watch, told TechCrunch. “No child should have to go through a border-style identity check just to get a school meal, so it's encouraging to see the ICO step in. Children should be taught how to manage their personal information, not treated by schools like walking barcodes and encouraged to hand over their biometric data on a whim.”
What's more, the UK's GDPR states that children aged 13 and over can generally consent to how their data is processed themselves. Because the school relied on a parental opt-out rather than seeking consent from pupils directly, the majority of its students, most of whom were old enough to give their own consent, were not able to “exercise their rights and freedoms”.
“Ensuring that people's information is handled correctly in a school canteen environment is just as important as handling the food itself,” Lynne Currie, head of privacy innovation at the ICO, said in a statement. “We expect all organisations to carry out the necessary assessments when introducing any new technology to mitigate data protection risks and ensure compliance with data protection law.”
It's worth noting that the ICO has the power to issue large fines to organisations that breach data protection law, as evidenced when US facial recognition company Clearview AI was fined around $10 million for a series of breaches. However, the ICO is unlikely to treat a public school the same way it would a private company, and in this case, especially as it was the school's first breach, it decided a public reprimand was more appropriate.
“We have taken action against this school to show that introducing measures like FRT is not something that should be taken lightly, particularly where children are concerned,” Currie added. “We do not want this action to discourage other schools from adopting new technology, but it must be done right, putting data protection first, championing trust, protecting children's privacy and upholding children's rights.”
TechCrunch has reached out to Chelmer Valley High School for comment and will update here if we hear back.