Live facial recognition (LFR) technology captures faces from near real-time video images and matches them against a watchlist of individuals provided by the police. Opponents of police use of LFR have argued it threatens human rights, specifically the right to respect for private life.

In August 2020, the court of appeal found that the use of facial recognition technology by South Wales police had been unlawful. This followed an unsuccessful judicial review application to the high court in 2019. The court of appeal found there was no clear guidance on the force's use of LFR and that South Wales police had not taken reasonable steps to ensure the technology did not have a gender or racial bias. However, it did not outlaw the use of the technology, stating that the benefits could outweigh the human rights concerns.

In November 2020, following the judgment, the Surveillance Camera Commissioner issued an updated version of the best practice guidance on the use of facial recognition technology. In his foreword to this updated guidance, he argued the police should be able to use facial recognition technology in “appropriate circumstances”. However, he said “there remains a degree of opaqueness” regarding the legal framework governing its use.

There have been calls for a moratorium on the use of facial recognition technology by the police and other public agencies. In its 2019 report on the work of the biometrics commissioner and the forensic science regulator, the House of Commons Science and Technology Committee argued that the use of facial recognition technology should be halted until a legislative framework had been introduced and there was better oversight of its use. In its response to the report, the Government argued there were already adequate legal protections in place concerning the use of facial recognition technology.

Concerns raised in the House of Lords

In March 2020, Lord Strasburger (Liberal Democrat) tabled an oral question in the House of Lords on the use of facial recognition technology by the Metropolitan Police. Lord Foster of Bath (Liberal Democrat), who asked the question on Lord Strasburger’s behalf, claimed the Government were ignoring a rapid expansion in the use of facial recognition technology. He argued that the UK was becoming a “surveillance society”. In her response, Baroness Williams of Trafford, Minister of State at the Home Office, said the police were operating within the legal framework for the technology’s use and that this had been recognised by the high court (this was prior to the court of appeal judgment).

On 12 April 2021, the use of facial recognition technology will be the subject of the following oral question in the House of Lords:

Lord Clement-Jones (Liberal Democrat) to ask Her Majesty’s Government what assessment they have made of (1) the Council of Europe consultative committee’s Guidelines on Facial Recognition, published on 28 January, and (2) the Biometrics and Forensics Ethics Group’s Briefing Note on the Ethical Issues Arising from Public–Private Collaboration in the Use of Live Facial Recognition Technology, published on 21 January; and what (a) legislative, or (b) regulatory, changes they intend to make as a result.

The use of facial recognition technology was also the subject of a question for short debate in 2018, tabled by Baroness Jones of Moulsecoomb (Green Party).

Other commentary and guidelines

Council of Europe: guidelines on facial recognition

In January 2021, the consultative committee of the Council of Europe convention for the protection of individuals with regard to automatic processing of personal data published its guidelines on facial recognition. The guidelines address the use of facial recognition technology by both governments and the private sector.

In the press release accompanying the publication of these guidelines, the Council of Europe argued “strict rules” were necessary to ensure that facial recognition technology was used in a way that avoided unnecessary risks to privacy and ensured that individuals’ data was protected. It also said certain uses of facial recognition technology should be banned, including for the sole purpose of determining “a person’s skin colour, religious or other belief, sex, racial or ethnic origin, age, health or social status”.

Biometrics and Forensics Ethics Group: public–private collaboration in use of live facial recognition technology

The Biometrics and Forensics Ethics Group (BFEG), an advisory body sponsored by the Home Office, previously published a briefing on the ethical issues arising from police use of LFR in 2019.

Its briefing raised concerns about the risk of error and biases arising from this technology. Specifically, it highlighted a particular concern about the risk of discrimination on grounds of race, ethnicity and gender:

Biometric technologies for facial recognition require machine-learning algorithms that have been trained on a dataset of labelled images. The system can only ‘recognise’ faces within the parameters of the data that it has been trained on and previously exposed to. If certain types of faces (for example, Black, Asian and Ethnic Minority faces or female faces) are under-represented in LFR training datasets, then this bias will feed forward into the use of the technology by human operators.

Its 2019 report also argued there was a lack of independent oversight and governance regarding the use of LFR.

The BFEG’s briefing note on the ethical issues arising from public–private collaboration in the use of LFR technology was published in January 2021. It cited several instances where the police had worked with the private sector to enable the use of LFR in public places, including in Manchester, South Yorkshire and London. The BFEG argued the use of biometric recognition technologies, including LFR, was likely to increase. It recommended there should be more transparency about how the data collected, including information that might be used to develop machine learning, was shared with the private sector.

It also noted that the 2020 judgment from the court of appeal had found the police had too broad a discretion when deciding who should be included on the watchlists used for LFR.

The BFEG argued that, before the police decide to collaborate with the private sector on the use of LFR, they must:

  • demonstrate that collaboration is necessary;
  • demonstrate that the data sharing required in the collaboration is proportionate; and
  • define what types of data are being shared.

It also made several recommendations about the use of LFR, including that the watchlists provided by the police should be “narrow and targeted”. In addition, it argued that an independent ethics group should oversee the use of LFR.

Cover image from Pixabay.