The House of Lords is scheduled to debate the following motion on 2 February 2022:

Lord Clement-Jones (Liberal Democrat) to move that this House regrets the Surveillance Camera Code of Practice because (1) it does not constitute a legitimate legal or ethical framework for the police’s use of facial recognition technology, and (2) it is incompatible with human rights requirements surrounding such technology.

If agreed to, the regret motion would not compel the Government to withdraw the code, but it would formally record the House’s reservations.

Surveillance Camera Code of Practice

The ‘Surveillance Camera Code of Practice’ was first published in June 2013 under provisions in the Protection of Freedoms Act 2012. It provides guidance, including 12 guiding principles, on the appropriate use of surveillance camera systems by local authorities and the police. Under the 2012 act, these bodies “must have regard to the [code] when exercising any functions to which the code relates”.

The 2012 act permits updated versions of the code to be published if neither House of Parliament objects to the revisions within a set period. (The code is not a statutory instrument (SI), but this parliamentary scrutiny process for revisions is similar to the ‘made negative’ procedure that applies to many SIs.) Following a statutory consultation exercise, the Government laid an updated code before both Houses on 16 November 2021. Neither House formally objected to the revised code within the scrutiny period, so it came into effect on 12 January 2022.

In an explanatory memorandum published alongside the revised code, the Home Office explained that the updates made in this most recent revision were limited in scope. Changes were made mainly to reflect developments since the code was first published, including changes introduced by legislation such as the Data Protection Act 2018 and those necessitated by a Court of Appeal judgment on police use of live facial recognition issued in August 2020 (Bridges v South Wales Police [2020] EWCA Civ 1058). The text was also consolidated “to make it easier to read, but without affecting the substantive sections of the code”. The Government said it would consider separately several suggestions for more substantive revisions received during the statutory consultation exercise.

Reporting in December 2021, the House of Lords Secondary Legislation Scrutiny Committee noted that the revised code “reflects the [Court of Appeal] judgment by restricting the use of live facial recognition to places where the police have reasonable grounds to expect someone on a watchlist to be”. It added that the technology “cannot be used for ‘fishing expeditions’”. The committee continued:

The code now requires that if there are no suggested facial matches with the watchlist, the biometric data of members of the public filmed incidentally in the process should be deleted immediately. Because the technology is new, the revised code also emphasises the need to monitor its compliance with the public sector equality duty to ensure that the software does not contain unacceptable bias. We note that a variety of regulators are mentioned in the code and urge the authorities always to make clear to whom a person who objects to the surveillance can complain.

Facial recognition technology: legal, ethical and human rights considerations

There is disagreement between the Government and its critics on the extent to which the code forms part of a sufficient legal and ethical framework to regulate police use of facial recognition technology, and whether it is compatible with human rights requirements. Opponents of the use of such technology by the police, such as the civil liberties membership organisation Liberty, have argued that it threatens certain rights, including the right to respect for private life, and can also discriminate against people with certain protected characteristics.

In September 2019, the High Court ruled in a case brought by Ed Bridges, a former Liberal Democrat councillor based in Cardiff, against South Wales Police (SWP) that police use of live automated facial recognition technology (known as LFR or AFR) was lawful. (See [2019] EWHC 2341 (Admin) and an accompanying summary). In particular, the court ruled that the code formed part of a legal framework that was “sufficient” to satisfy “in accordance with the law” requirements in several respects.

Mr Bridges, supported by Liberty, had brought a claim for judicial review on the basis that AFR was not compatible with the right to respect for private life under the European Convention on Human Rights and the Human Rights Act 1998; data protection legislation; and the public sector equality duty under the Equality Act 2010. However, the court said in its judgment:

We are satisfied both that the current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR Locate [a facial recognition tool], and that SWP’s use to date of AFR Locate has been consistent with the requirements of the Human Rights Act, and the data protection legislation.

Mr Bridges appealed and the Court of Appeal later ruled, in August 2020, that SWP’s use of AFR had not in fact been in accordance with the law on several grounds (see [2020] EWCA Civ 1058), including in relation to certain convention rights; data protection legislation; and the public sector equality duty.

Interpretations of the Court of Appeal judgment’s implications for the continued use of facial recognition technology differ. For example, Liberty has cited the court’s finding that there were “fundamental deficiencies” in the legal framework then in place and argued that the judgment provides evidence that police use of facial recognition technology “breaches privacy rights, data protection laws and equality laws”. However, the Government takes a different view, arguing that the revised code remedies the deficiencies to which the court referred. In November 2021, Minister of State at the Home Office Baroness Williams of Trafford said:

For a while now, there has been a question about LFR being used in a legal way. Noble Lords who are geeks on this subject will know about the Bridges v South Wales Police case. The Court of Appeal said that there was a legal framework for police use of LFR which allows its use for policing purpose and where it is necessary and proportionate. The framework includes police common law powers to prevent and deter crime, data protection, equality and human rights legislation, the Surveillance Camera Code of Practice, and forces’ own published policies. The appeal also confirmed that forces’ published policies need to provide more clarity about when they will use this technology and who they are looking for when they use it. The College of Policing has now completed its consultation on national guidance to address the gaps, which it is intended will be published early next year.

The Government suggests that the updated code, which applies at the national level in England and Wales, will help improve the consistency of guidance issued by police forces on the use of facial recognition technology. This is in addition to any alignment with new College of Policing guidance, expected to be published early in 2022.

Calls for change, including a temporary or permanent suspension of LFR

In July 2019, the House of Commons Science and Technology Committee published a report entitled ‘The Work of the Biometrics Commissioner and the Forensic Science Regulator’. In it, the committee criticised the slow progress of police forces in deleting custody images and repeated a call made in an earlier 2018 report that “automatic facial recognition should not be deployed until concerns over the technology’s effectiveness and potential bias have been fully resolved”. It continued:

We call on the Government to issue a moratorium on the current use of facial recognition technology and no further trials should take place until a legislative framework has been introduced and guidance on trial protocols, and an oversight and evaluation system, has been established.

The Government response, published 20 months later in March 2021, argued that there was “already a comprehensive legal framework for the management of biometrics, including facial recognition”. It described the landscape as follows:

This includes police common law powers to prevent and detect crime, the Data Protection Act 2018 (DPA), the Human Rights Act 1998, the Equality Act 2010, the Police and Criminal Evidence Act 1984 (PACE), the Protection of Freedoms Act 2012 (POFA), and police forces’ own published policies. In terms of oversight and regulation, the Information Commissioner’s Office (ICO) regulates compliance with the DPA, including police use and retention of biometrics and POFA created the Surveillance Camera Commissioner and Biometrics Commissioner roles, and the Forensic Information Databases Service strategy board, which oversees the police DNA and fingerprint databases. PACE also provides specific powers for police to collect DNA, fingerprints and custody images and sets out the data retention regime for DNA and fingerprints. There is also an agreed regime for the retention, review and deletion of custody images laid out in the College of Policing’s Authorised Professional Practice (APP) on the Management of Police Information.

The Government welcomed the Court of Appeal’s confirmation that there was an “existing legal framework made up of legislation and published local police policies” for the use of “biometric identification and overt surveillance, for a policing purpose and where necessary and proportionate”. The Government acknowledged, however, that this framework was “complex for the police and public”, but it did “not agree that there should be a moratorium or an outright ban” on the use of facial recognition technology. In a subsequent letter to the committee’s chair, Greg Clark, the Minister of State for Crime and Policing, Kit Malthouse, reiterated the Government’s view that there “already is a comprehensive legal framework” for biometrics governance in place, which “we are taking measures to improve”.

In October 2019, former Information Commissioner Elizabeth Denham published an ‘opinion’ on the use of LFR technology by law enforcement in public places. In it, she expressed her view that the “combination of law and practice” cited by SWP in the initial judicial review case could be “made more clear, precise and foreseeable so that individuals can better understand when their biometric data may be processed by LFR”. She expanded on this point as follows:

The High Court, in Bridges v SWP, recognised that steps could and perhaps should be taken to further codify the relevant legal standards, and that the sufficiency of the legal regime would require periodic review to ensure it keeps pace with developments in the technology. It is the view of the commissioner that a statutory and binding code of practice, issued by government, should seek to address the specific issues arising from police use of LFR and, where possible, other new biometrics technologies. This would reflect developments in technology and should remain viable to deal with future technological changes in this area.

Such a code should provide greater clarity about proportionality considerations, given the privacy intrusion that arises as a result of the use of LFR, eg facial matching at scale. Without this, we are likely to continue to see inconsistency across police forces and other law enforcement organisations in terms of necessity and proportionality determinations relating to the processing of personal data. Such inconsistency, when left unchecked, will undermine public confidence in its use and lead to the law becoming less clear and predictable in the public’s mind. The more police forces or law enforcement organisations seek to trial the technology, or indeed opt to use it as part of standard operations, the more likely we are to see inconsistency and compliance failures. In the Commissioner’s view, this code should therefore be considered by government at the earliest opportunity.

In June 2021, Ms Denham issued another opinion on the use of LFR technology in public places. In an accompanying blog post, she described this as setting out the “rules of engagement”, building on the earlier opinion and setting a high threshold for use of the technology. She added:

Organisations will need to demonstrate high standards of governance and accountability from the outset, including being able to justify that the use of LFR is fair, necessary and proportionate in each specific context in which it is deployed. They need to demonstrate that less intrusive techniques won’t work. These are important standards that require robust assessment. Organisations will also need to understand and assess the risks of using a potentially intrusive technology and its impact on people’s privacy and their lives. For example, how issues around accuracy and bias could lead to misidentification and the damage or detriment that comes with that.

In December 2020, following the Court of Appeal judgment, former Surveillance Camera Commissioner Antony Porter reissued best practice guidance for the police on the use of facial recognition technology. In it, he argued that the legal framework governing the use of LFR could be clearer. In a foreword to the updated guidance, Mr Porter wrote that police use of LFR technology “serves to illustrate a much wider challenge facing society […] namely the creation and modernisation of rules and safeguards which inform, constrain and hold to account the overt use of increasingly sophisticated and available surveillance technologies by law enforcement agencies”. He added that the technology “unquestionably has the potential to help the police keep communities safe from harm” but cautioned that its use was “intrusive”. He concluded that he believed there remained a “degree of opaqueness as to the current legal framework which accommodates LFR use by the police” which would “benefit from a revision of legislation, the secretary of state’s Surveillance Camera Code of Practice and regulatory safeguards”.

In December 2020, the Biometrics and Forensics Ethics Group (BFEG), an advisory non-departmental public body sponsored by the Home Office, updated its ‘ethical principles’ to explicitly state that “procedures should not deliberately or inadvertently target or selectively disadvantage people or groups on the basis of ‘protected characteristics’ as defined in the Equality Act 2010”. It said the update also added the principle that “procedures should respect, without discrimination, human rights as defined in the Human Rights Act 1998”.

The following month, the BFEG published a briefing note on the ethical issues arising from public-private collaboration (P-PC) in the use of LFR technology, including between police and private organisations. In a summary conclusion, the group said:

In the absence of regulation, the working group outlined a number of issues that should be addressed prior to setting up of P–PCs in the use of LFR. The [Live Facial Recognition Working Group] also made a number of recommendations that should be followed by those involved in P–PCs, including that an independent ethics group should have oversight of the use of LFR by police forces and in P–PCs.

Other bodies, such as the Ada Lovelace Institute, have described the UK as “not ready for facial recognition technology”. For example, the institute has called for a “voluntary moratorium by all those selling and using facial recognition technology [to] enable a more informed conversation with the public about limitations and appropriate safeguards”. Groups such as Big Brother Watch have said the technology “needs to be stopped”.

The House of Lords Justice and Home Affairs Committee is currently undertaking an inquiry into new technologies and the application of the law. The inquiry is no longer accepting evidence but has not yet reported.

Cover image by teguhjati pras on Pixabay.