On 4 November 2021, the House of Lords is due to consider the following question for short debate:
Lord Clement-Jones (Liberal Democrat) to ask Her Majesty’s Government what assessment they have made of the use of facial and other biometric recognition technologies in schools.
What are biometric recognition technologies?
The International Organization for Standardization has defined biometrics/biometric recognition as the “automated recognition of individuals based on their biological and behavioural characteristics”. There are many types of biometrics such as DNA matching, fingerprint recognition and facial recognition.
The Information Commissioner’s Office (ICO) has stated that biometric data is particularly sensitive because it is more permanent and harder to change than other personal data. In addition, it can be used to estimate or infer other characteristics, such as age, sex, gender or ethnicity.
Facial recognition technology
Explaining how facial recognition technology (FRT) works, the ICO said that cameras are used to capture images which FRT software uses to produce a biometric template. The system will usually then estimate the degree of similarity between two facial templates to identify a match and verify someone’s identity, or place a template in a particular category, for example an age group. The technology has a variety of uses and could enable someone to unlock a mobile phone, set up a bank account online or pass through passport control.
The ICO highlighted that while this type of technology usually involves a one-on-one process—where the individual takes part directly and is aware of how their data is being used—live facial recognition (LFR) is different. LFR is typically deployed in a similar way to CCTV and is directed towards everyone in an area. It can capture the biometric data of all individuals who pass within range of the camera “automatically and indiscriminately”. This data is collected in real time and potentially on a mass scale. The ICO said that “there is often a lack of awareness, choice or control for the individual in this process”.
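The matching step the ICO describes — comparing two facial templates and accepting a match above a similarity threshold — can be sketched in a few lines. This is an illustrative toy only: the function names, the 4-dimensional vectors and the 0.9 threshold are all assumptions, and real FRT systems derive high-dimensional templates from images using trained neural networks.

```python
import math

def cosine_similarity(a, b):
    # Estimate the degree of similarity between two facial templates,
    # here represented as plain feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(stored_template, captured_template, threshold=0.9):
    # One-to-one verification: does the newly captured template
    # match the template enrolled for this individual?
    return cosine_similarity(stored_template, captured_template) >= threshold

# Toy templates (hypothetical values, for illustration only).
enrolled = [0.9, 0.1, 0.4, 0.2]
probe_same = [0.88, 0.12, 0.41, 0.19]   # similar face
probe_other = [0.1, 0.9, 0.2, 0.7]      # different face

print(verify(enrolled, probe_same))   # similar template accepted
print(verify(enrolled, probe_other))  # dissimilar template rejected
```

The same comparison run "automatically and indiscriminately" against every template captured from a camera feed, rather than one consenting individual, is what distinguishes LFR from the one-to-one uses described above.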
Is facial recognition technology used in schools?
In October 2021, press reports highlighted that nine schools in North Ayrshire had begun taking payments for school lunches by scanning the faces of pupils. The schools said that the new system would speed up queues and be more Covid-secure than the card payments and fingerprint scanners used previously (concerns have previously been raised about the use of fingerprint recognition). It has been reported that 97% of children, or their parents, had given consent for the new system.
Explaining why the technology had been introduced, David Swanson, managing director of CRB Cunninghams, the company that installed the systems, argued that when a secondary school has around 25 minutes to serve up to 1,000 pupils, transaction time is important. He said that FRT could cut the average transaction time to five seconds per pupil. Mr Swanson also argued that the technology used is different from LFR.
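A back-of-envelope check of the figures quoted makes the argument concrete. Assuming a single 25-minute service window and the five-second transaction time Mr Swanson cites (both figures from the paragraph above; the single-window assumption is mine):

```python
# Figures quoted in the briefing.
service_seconds = 25 * 60          # one 25-minute lunch service
pupils = 1000                      # pupils to serve
seconds_per_transaction = 5        # claimed FRT transaction time

# Throughput of one serving point, and the parallel points needed.
pupils_per_till = service_seconds // seconds_per_transaction
tills_needed = -(-pupils // pupils_per_till)   # ceiling division

print(pupils_per_till)  # 300 pupils per serving point
print(tills_needed)     # 4 parallel serving points
```

On these assumptions a single till could serve only 300 pupils in the window, so even at five seconds per transaction a school of 1,000 would need several parallel serving points — which is why transaction time is presented as the binding constraint.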
A school in Ashton-under-Lyne has meanwhile decided to drop its rollout of a facial recognition system that would have been “an upgrade to the catering cashless system”.
What concerns have been raised?
Responding to the press reports about the use of FRT in schools, the ICO said that “organisations need to carefully consider the necessity and proportionality of collecting biometric data before they do so” and that they “should consider using a different approach if the same goal can be achieved in a less intrusive manner”.
The digital rights group Defend Digital Me contends that the use of facial recognition in schools is “an excessive interference with children’s right to [the] protection of their privacy”. It also argued that no consent can be freely given when the power imbalance between pupils and the school authority makes refusal difficult. It claimed that the notifications sent to parents it had seen were worded to make acceptance seem compulsory; high uptake of a scheme could therefore be mistaken for approval. It also argued that the use of FRT for such purposes is likely unlawful under the UK General Data Protection Regulation (UK GDPR), highlighting court cases in France and Sweden where schools were told to stop using FRT.
Silkie Carlo from the campaign group Big Brother Watch has also argued that “no child should have to go through border-style identity checks just to get a school meal”. She said that biometrics are “highly sensitive, personal data that children should be taught to protect” and raised concerns that the biometrics company had not disclosed who else the children’s personal information could be shared with.
In addition, concerns have been raised about the use of FRT in public settings more widely. The Centre for Data Ethics and Innovation, a government expert body, has summed up some of these concerns in its briefing on FRT use in the UK. It found that while FRT can provide additional security when accessing devices and places, and increased efficiency in a number of settings, the rise in its use has concerned civil society groups and political leaders. It said that objections centre on the potential for some uses of FRT, particularly in public settings, to:
- undermine individual privacy;
- entrench bias and produce unequally distributed outcomes, especially where systems have different accuracy rates for different demographic groups; and
- bestow private and public organisations with disproportionate power to surveil the population, potentially leading to worrying consequences for rights such as freedom of expression and association.
A 2019 study by the Ada Lovelace Institute (a research institute focusing on data and artificial intelligence) found that a majority of people are uncomfortable with FRT being used in schools. It reported that 67% of those surveyed were uncomfortable with its use in schools, compared with 61% for its use on public transport and 29% who were uncomfortable with police forces using FRT systems.
Focusing further on the issue, the institute has commissioned Matthew Ryder QC to lead an independent review of the governance of biometric data. The review will examine any gaps in the existing regulatory framework and make recommendations for reform to ensure that biometric data, including facial characteristics, is governed “consistently with human rights, the public interest and public trust”. It is due to report its findings in autumn 2021.
How is the use of FRT governed?
The Centre for Data Ethics and Innovation (CDEI) has explained that the use of FRT is governed by several UK laws and regulations. It has highlighted that the Data Protection Act 2018, which implements GDPR, is applicable to all uses of FRT, both private and public, and is enforced and regulated by the ICO.
The CDEI has also noted that there are additional requirements for processing biometric data as the ICO considers it ‘special category data’. Processing must therefore meet one of the conditions in article 9 of the UK GDPR, together with any associated Data Protection Act schedule 1 conditions where required. Most of the conditions depend on the ability to demonstrate that the processing is ‘necessary’ for a specific purpose. Although this does not mean that it has to be absolutely essential, it must be “more than just useful or habitual”. It must also be “a targeted and proportionate way of achieving that purpose”, as the condition does not apply if the same purpose could reasonably be achieved by some other less intrusive means, especially if that would avoid the use of special category data.
In addition, the ICO has said that children need particular protection when their data is being collected and processed as “they may be less aware of the risks involved”.
Calls for change
A 2019 House of Commons Science and Technology Committee report on the work of the Biometrics Commissioner and the Forensic Science Regulator argued that the Government should stop the use of facial recognition technology until a legislative framework had been introduced and there was better oversight and evaluation of its use. Responding in March 2021, the Government argued that there is “already a comprehensive legal framework for the management of biometrics, including facial recognition”.
More recently, in June 2021, the committee held a follow-up evidence session on biometrics and forensics. Following the session, the committee wrote to the Government stating that while it welcomed the progress made since its 2019 report, it had serious concerns about a number of issues, including biometric governance. However, most of the requests for further information focused on the use of biometric recognition technology by the police and LFR. In reply to the letter, the Government referred to its earlier response that there is a comprehensive legal framework in place “which we are taking measures to improve”.
Cover image by teguhjati pras on Pixabay.