Table of contents
- 1. UN practical toolkit for law enforcement officials to promote and protect human rights in the context of peaceful protests
- 2. Concerns about UK use of surveillance technologies
- 3. Police standards for the use of live facial recognition at protests
- 4. Government policy
- 5. Scrutiny by the House of Lords Justice and Home Affairs Committee
- 6. Read more
The House of Lords is due to consider the following question for short debate on 25 April 2024:
Baroness Jones of Moulsecoomb (Green) to ask His Majesty’s Government what assessment they have made of the ‘Practical toolkit for law enforcement officials to promote and protect human rights in the context of peaceful protests’, published on 7 March by the UN special rapporteur on the rights to freedom of peaceful assembly and association, and how they intend to ensure that the United Kingdom aligns with United Nations standards on the use of surveillance technology at protests.
1. UN practical toolkit for law enforcement officials to promote and protect human rights in the context of peaceful protests
1.1 Background
The UN special rapporteur on the rights to freedom of peaceful assembly and of association published a ‘Practical toolkit for law enforcement officials to promote and protect human rights in the context of peaceful protests’ on 7 March 2024.[1] It was developed in collaboration with the UN Office on Drugs and Crime (UNODC) and the Office of the UN High Commissioner for Human Rights (OHCHR). The aim of the toolkit is to “enhance the capacity and practices of law enforcement agencies to fulfil their duty to promote and protect human rights in the context of peaceful protests”, based on international human rights law, standards and good practices.[2]
The toolkit was produced following the adoption by the UN’s Human Rights Council in July 2022 of a resolution that (among other things) requested the special rapporteur to develop “specific technical and practical tools based on international standards and best practices to assist law enforcement officials in promoting and protecting human rights in the context of peaceful protests”.[3] The resolution covered a variety of topics relating to peaceful protests. This included expressing concerns about technology such as CCTV, aerial surveillance vehicles, facial recognition and international mobile subscriber identity (IMSI) catchers (‘stingrays’) being used for “arbitrary and unlawful surveillance” of people engaged in peaceful protests. It called on states to refrain from using digital technology for unlawful surveillance and biometric identification technologies for “arbitrary or unlawful” identification of those peacefully participating in an assembly.
The UN Human Rights Council is an intergovernmental body within the United Nations system that is responsible for human rights.[4] It is composed of 47 member states elected for three-year terms. The UK’s most recent term ended in 2023.[5]
1.2 What does the toolkit say?
The toolkit consists of a model protocol and three supplementary components:
- Model protocol: The ‘Model protocol for law enforcement officials to promote and protect human rights in the context of peaceful protests’ was published on 31 January 2024.
- Component 1: Action-oriented checklists for law enforcement officials. The checklists are intended to provide “practical measures law enforcement should take to ensure that the requirements of the protocol are fulfilled”. The checklists have not yet been published, but the UN website says they are “coming soon”.[6]
- Component 2: Principles-based guidance for the human rights-compliant use of digital technologies in the context of peaceful protests. This is intended to provide “practical guidance and foundational principles” for law enforcement. The guidance, entitled ‘Human rights compliant uses of digital technologies by law enforcement for the facilitation of peaceful protests’, was published on 7 March 2024.
- Component 3: A handbook for law enforcement. This is intended to help law enforcement “implement and operationalise the provisions of the model protocol”. It has not yet been published but is due to be finalised by the end of 2024.[7]
The model protocol sets out principles relating to the effective facilitation of peaceful protests, including general principles and norms; broader human rights policing principles relating to communication, training and accountability; and specific principles to follow before, during and after a protest. Some of its provisions relate specifically to surveillance and the use of technology around protests. One section sets out key principles for the use of digital technologies in the facilitation of protests, as follows:
Any use of digital technologies to facilitate a protest should be solely aimed at enabling the right to freedom of peaceful assembly. Protests should not be seen as opportunities for surveillance or the pursuit of broader law enforcement objectives through the use of digital technologies.
Legal frameworks related to digital technologies conforming to international human rights law and standards, including data protection laws and robust regulation and oversight mechanisms, must be established and supported by practical guidance. The acquisition and use of any digital technologies in the context of protests must meet the requirements of legality, necessity and proportionality. This must be demonstrated effectively and supported by appropriate evidence.
Internet shutdowns, surveillance on the basis of group affiliation and the targeted use of spyware in the context of protests are actions that are incompatible with international human rights law and must not be used.
Other headings in the model protocol also contain guidance relevant to the use of digital technology and surveillance. For example, under training, the model protocol says that law enforcement officials should be trained on the human rights implications of any digital technologies used in the context of protests. Under accountability, the model protocol says that oversight mechanisms should incorporate appropriate human rights safeguards, such as a “transparent and auditable record […] of all pertinent decision-making concerning digital technologies”, and restrictions and procedures around data retention.
In relation to law enforcement planning before a protest takes place, the model protocol says that the human rights implications of digital technologies, including “less visible impacts, such as the creation of any chilling effects”, should be assessed. It states that the overall approach should be “premised on a limiting principle to circumscribe the use of such technologies, rather than an authorising principle intended to expand their use”. Any decisions to use digital technologies for information-gathering should be “made on a case-by-case basis and aligned with the specific law enforcement objectives and circumstances”. The threshold for their use should be high, and “less intrusive techniques” should be used wherever possible. Digital technologies being used to assess the evolving situation for the purposes of facilitating a protest should not lead to “intelligence-gathering with regard to peaceful protests for unrelated law enforcement objectives”.
The model protocol also sets out principles relating to evidence-based risk assessment by law enforcement during a protest. It acknowledges that “ongoing human-rights compliant collection of information can help law enforcement officials to act in a timely manner to prevent violence or escalation and to distinguish between peaceful protestors and individuals who are involved in violence”. At the same time, it says that law enforcement officials should deploy the least intrusive methods of gathering information and ensure that any decision to use surveillance strategies and/or record participants is “exceptional and limited to distinct law enforcement purposes, such as directly aiding the prevention, investigation and prosecution of a specific criminal offence occurring within the context of the protests”. It stipulates that facial recognition technologies and other biometric systems “must not be utilised to identify individuals who are peacefully participating in a protest”.
Component 2 of the toolkit, the ‘Human rights compliant uses of digital technologies by law enforcement for the facilitation of peaceful protests’, reiterates and in some cases expands on the principles set out in the model protocol. It covers general principles, guidance on taking a human rights-based approach to using digital technologies, guidance on their use before, during and after a protest, and provisions on oversight and accountability. The guidance suggests ways that digital technologies could be used to help law enforcement officials carry out their obligations to facilitate the right to peaceful assembly, such as estimating the number of attendees, crowd densities or the likely route of a protest. Equally, digital technologies could help law enforcement officials respond to emerging security threats, for instance by enabling the separation of specific participants directly engaged in or threatening violence at an otherwise peaceful protest. However, the guidance emphasises that “a bright line distinction must be made” between the use of digital technologies for those purposes and the recording or surveillance of protest participants. It states that “any decision to record, process or retain information, should be exceptional, subject to a high level of justification, stringent authorisation, and based exclusively on the obligation to ensure accountability”.
2. Concerns about UK use of surveillance technologies
The UN Human Rights Committee recently expressed concerns about the use of surveillance technology in the UK. The Human Rights Committee—a different body from the Human Rights Council—is a body of independent experts that monitors implementation of the International Covenant on Civil and Political Rights by states that are party to the covenant.[8] In its concluding observations on the latest periodic report from the UK, the committee said in March 2024 it was “concerned about the increased use by police forces of facial recognition technology to monitor peaceful gatherings”.[9] It said the UK should “end the use of facial recognition and other mass surveillance technologies by law enforcement agencies at protests, to safeguard privacy, non-discrimination, freedom of expression, association, and assembly rights for protestors”.
Human rights and privacy groups had raised their concerns to the UN Human Rights Committee as part of the review process. For instance, Human Rights Watch submitted that:
UK police forces are increasingly using live facial recognition technology to monitor peaceful gatherings, from Notting Hill Carnival to the coronation of King Charles III. Northamptonshire Police used live facial recognition technology at the 2023 Formula 1 grand prix, seemingly in an attempt to deter and monitor environmental protests at the event. Facial recognition surveillance technology is simply not compatible with human rights, and Human Rights Watch calls for a ban on its use in public spaces, as it undermines privacy rights, and poses a serious risk to non-discrimination, freedom of expression, assembly and association rights.[10]
Live facial recognition is a real-time deployment of facial recognition technology.[11] It uses live video footage of crowds passing a camera and compares their images against a predetermined watchlist of people of interest to the police, generating an alert when a possible match is found. Following an alert, a police officer then decides what action, if any, to take.
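For readers unfamiliar with how such systems operate, the matching step described above can be sketched in illustrative code. The sketch below is a simplification under stated assumptions: the face detector, the embedding model, the people named and the similarity threshold are all hypothetical stand-ins, not any police force’s actual system.

```python
import random

MATCH_THRESHOLD = 0.6  # hypothetical similarity score above which a possible match is flagged


def embed_face(face_id: str) -> list[float]:
    """Stand-in for a face-embedding model: a deterministic numeric template."""
    rng = random.Random(face_id)
    return [rng.uniform(-1.0, 1.0) for _ in range(128)]


def detect_faces(frame_id: str) -> list[str]:
    """Stand-in for a face detector running over live video footage."""
    crowd = {"frame-1": ["alice", "bob"], "frame-2": ["carol", "dave"]}
    return crowd.get(frame_id, [])


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm


# Watchlist templates are prepared in advance from images of people of interest.
watchlist = {"alice": embed_face("alice")}

for frame_id in ["frame-1", "frame-2"]:
    for face_id in detect_faces(frame_id):
        template = embed_face(face_id)
        for name, reference in watchlist.items():
            if cosine_similarity(template, reference) >= MATCH_THRESHOLD:
                # An alert is only a possible match: a police officer reviews
                # it and decides what action, if any, to take.
                print(f"{frame_id}: possible match for '{name}', refer for human review")
        # Templates that match no watchlist entry go out of scope here and
        # are not retained.
```

In real deployments the detector, the embedding model and the threshold are proprietary and operationally chosen; the sketch only illustrates the compare-against-watchlist, alert and human-review loop described above.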
Similarly to Human Rights Watch, Privacy International called for the Human Rights Committee to ask the UK to “halt and ban the use of live facial recognition technology and ensure that any power to undertake targeted surveillance during protest is transparently regulated and adheres to requirements under international human rights law”.[12] Privacy International maintained the police “continue to wrongly justify using FRT [facial recognition technology] through a patchwork of legislation, relying on their common law policing powers and data protection legislation as sufficient in regulating its use”.
Civil liberties organisation Liberty has described live facial recognition by the police as a “gross violation of your human rights”.[13] It said that the ‘maps’ that facial recognition technology creates of an individual’s face are identifiable biometric data, like a fingerprint, and that when live facial recognition cameras scan everyone in sight, “this data is being snatched from you without your knowledge or consent”. Big Brother Watch is also campaigning for an end to the live facial recognition surveillance of public spaces by police and private companies in the UK.[14]
3. Police standards for the use of live facial recognition at protests
3.1 Review by HM Inspectorate of Constabulary and Fire and Rescue Services
HM Inspectorate of Constabulary and Fire and Rescue Services (HMICFRS) published an inspection report in March 2021 looking at how effectively the police deal with protests.[15] It noted that the use of facial recognition technology “divides opinion: opponents point to its potential to violate human rights, while supporters believe it could help the police to identify those intent on committing crime or causing significant disruption and disorder”.
The inspectorate identified the police’s use of live facial recognition technology as an area for improvement and recommended that further work be done:
The National Police Chiefs’ Council should continue to work with the government and other interested parties. These bodies should develop a robust framework that supports forces, allowing the use of live facial recognition in a way that improves police efficiency and effectiveness while addressing public concerns about the use of such technology. The framework should be designed to help the police satisfy the requirements explained in the Court of Appeal judgment [2020] EWCA Civ 1058.[16]
3.2 Court of Appeal judgment on the use of automated facial recognition
The mention by HMICFRS of a 2020 Court of Appeal judgment is a reference to the Bridges case.[17] Edward Bridges, a civil liberties campaigner, was in the vicinity of two deployments of automated facial recognition (AFR) technology by South Wales Police: the first in December 2017 in Cardiff city centre and the second in March 2018 at a protest outside a defence technology exhibition.[18] Mr Bridges was not included on a South Wales Police watchlist for its AFR deployments, but he contended that his image was recorded by the AFR system, even if it was deleted almost immediately afterwards. South Wales Police did not contest this. Mr Bridges, supported by Liberty, brought a claim for judicial review on the basis that AFR was not compatible with the right to respect for private life under article 8 of the European Convention on Human Rights (ECHR), data protection legislation and the public sector equality duty under the Equality Act 2010.[19]
The Court of Appeal held that:[20]
- South Wales Police’s use of live AFR was not in accordance with the law for the purposes of article 8(2) of the ECHR. The court held that the legal framework (consisting of primary legislation (the Data Protection Act 2018), secondary legislation (the ‘Surveillance camera code of practice’) and South Wales Police local policies) gave no clear guidance on where AFR Locate, South Wales Police’s live AFR system, could be used and who could be put on a watchlist. The court held this was “too broad a discretion to afford to the police officers to meet the standard required” by article 8(2) of the ECHR. Article 8(2) allows public authorities to interfere with an individual’s right to respect for private and family life only where it is “in accordance with the law” and is necessary for certain purposes.
- South Wales Police did not comply with the public sector equality duty in its use of live AFR. This was because it had not taken reasonable steps to enquire whether the AFR Locate software was biased on racial or sex grounds. The court did note, however, that there was no clear evidence that the AFR Locate software was in fact biased on grounds of sex or race.
3.3 Police authorised professional practice and operational advice
The College of Policing (the professional body for policing) and the National Police Chiefs’ Council updated their ‘National protest operational advice’ document in August 2023 to reflect the improvements to the policing of protest activities that were suggested in the HMICFRS report.[21] This advice is intended to address “protest activity in a democratic society, and the role of the police in balancing the rights of those involved with protest with the rights of those affected”.
On the use of surveillance technologies, the guidance covers the use of evidence-gathering teams and image retention. It notes that police gathering and storing the data of people attending protests will interfere with their rights under article 8 of the ECHR and therefore the way this data is retained and deleted “is fundamental to avoid breaching their article 8 rights”.[22] The guidance states that “in particular, sensitive data without a legitimate purpose must be avoided”. It states that the use of body-worn video cameras can supplement the use of evidence-gathering teams (for instance, in providing evidence in relation to the conduct of stakeholders and the police during a protest) but should not replace them. However, the ‘National protest operational advice’ specifically does not cover the use of automatic facial recognition. It states that the use of this technology at public events, including protests, “is still at an early stage and falls outside the scope of this document”.[23]
The College of Policing also produces authorised professional practice (APP), which it describes as the official and most up-to-date source of policing practice.[24] The College of Policing states that police officers and staff are expected to have regard to APP in discharging their responsibilities, although there may be circumstances where there is a legitimate operational reason for a force to deviate from APP, providing there is a clear rationale for doing so.
The overview page of the APP on public order public safety (POPS) states that POPS covers events and operations, including protests, where there is a reasonably foreseeable risk to public order and/or public safety, and that the POPS APP is aimed at those involved in the planning, policing and command of such events.[25] However, the thematic pages of the POPS APP state that it is not a protest advice document, and POPS commanders should refer to the ‘National protest operational advice’ document when planning for the policing of protests.[26] The POPS APP also states that when using live facial recognition at a POPS event, police forces should ensure they comply with the requirements set out in the live facial recognition APP.[27]
The APP on live facial recognition was first published in March 2022; parts of it have subsequently been updated.[28] It covers the use of live facial recognition in general, not just at protests. However, it does state that when reviewing a potential deployment location, authorising officers (AOs) should consider the reasonable expectations of privacy that the general public may have, whether some locations attract greater privacy expectations than others, and whether the use of live facial recognition in some locations might make the public feel less able to express their views or more reluctant to be in the area.[29] Assemblies and demonstrations are suggested as locations where this could apply. The APP says that where privacy or human rights considerations are identified in relation to a particular deployment, the AO needs to consider the necessity of using the technology in that particular location, and whether the aims being pursued could be similarly achieved elsewhere. Where that location is necessary, and the processing of data at that site is strictly necessary, AOs need to identify any mitigations they could take and weigh the rights of those engaged by the live facial recognition system against the likely benefits of using it to ensure the use is not disproportionate.
The APP sets out general guidance for the overt deployment of live facial recognition technology to locate people on a watchlist. It covers:
- the criteria for putting someone on a watchlist
- the types of images that can be used on the watchlist
- considerations around the date, time, duration and location of deployment, including informing individuals in advance of their entering the zone of recognition
- minimum requirements and additional metrics that are relevant and suitable for collation and analysis for operational deployment of live facial recognition technology
Images that may be deemed appropriate for inclusion on a watchlist are those of people who are:[30]
- wanted by the courts
- suspected of having committed an offence, or where there are reasonable grounds to suspect that the individual depicted is committing or is about to commit an offence
- subject to bail conditions, court order or other restriction that would be breached if they were at the location at the time of the deployment
- missing persons deemed at increased risk of harm
- presenting a risk of harm to themselves or others
- a victim of an offence, or a person whom the police have reasonable grounds to suspect has information of importance and relevance to progress an investigation, or who is otherwise a close associate of an individual falling within one of the categories above (from those wanted by the courts to those presenting a risk of harm to themselves or others)
Images not originated by the police can only be included on a watchlist with the authorisation of the AO.
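Purely as an illustration, the watchlist criteria summarised above can be restated as a simple eligibility check. The field names below are paraphrases of the APP categories as described in this briefing, not terms drawn from the APP itself or from any real police system.

```python
from dataclasses import dataclass


@dataclass
class CandidateImage:
    """Hypothetical paraphrase of the APP watchlist categories."""
    wanted_by_courts: bool = False
    suspected_of_offence: bool = False           # committed, committing or about to commit
    subject_to_breachable_restriction: bool = False  # bail conditions, court order etc.
    missing_at_increased_risk: bool = False
    risk_of_harm_to_self_or_others: bool = False
    victim_witness_or_close_associate: bool = False
    police_originated: bool = True
    ao_authorised: bool = False                  # authorising officer sign-off


def eligible_for_watchlist(img: CandidateImage) -> bool:
    meets_category = any([
        img.wanted_by_courts,
        img.suspected_of_offence,
        img.subject_to_breachable_restriction,
        img.missing_at_increased_risk,
        img.risk_of_harm_to_self_or_others,
        img.victim_witness_or_close_associate,
    ])
    # Images not originated by the police additionally require AO authorisation.
    if not img.police_originated and not img.ao_authorised:
        return False
    return meets_category


# Example: a non-police image of a missing person needs AO authorisation.
print(eligible_for_watchlist(CandidateImage(missing_at_increased_risk=True,
                                            police_originated=False)))  # False
```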
The APP states that it “gives direction to forces that will enable them to ensure that their deployment of LFR is in compliance with applicable legal requirements”.[31] The College of Policing says the APP has been written taking into account the Bridges judgment, and that it also pays regard to the opinions, guidance and other documentation issued by the Surveillance Camera Commissioner (now the Biometrics and Surveillance Camera Commissioner) and the Information Commissioner.[32]
4. Government policy
Chris Philp, the minister for crime, policing and fire, wrote to chief constables and police and crime commissioners in October 2023 to express the government’s support for “developing facial recognition as a crime-fighting tool”.[33] He said he was “very supportive of the use of live—or active—facial recognition (LFR) to deter and detect crime in public settings that attract large crowds”. He mentioned a football match as an example of this but did not mention protests specifically. Addressing possible concerns around the technology, Mr Philp said:
There is College of Policing authorised professional practice in place and a sound legal basis for LFR. Recent testing by the National Physical Laboratory has provided the necessary assurance about accuracy and the absence of gender or racial bias in the algorithms and at the settings the Met and South Wales police have been using, and the immediate deletion of non-matched biometric data addresses privacy concerns.
The Metropolitan Police and South Wales Police are currently the only forces to have their own LFR technology, but South Wales Police has authorised the deployment of its equipment to Northamptonshire Police and Essex Police.[34]
Mr Philp said that recent deployments of LFR had “led to arrests that would otherwise have been impossible” as “no number of officers could have picked those people out of a crowd”. He said the technology was very accurate, and there had been no false alerts.[35] He argued it “has great potential to pick up wanted persons who would otherwise go undetected, and to protect public events from specific threats”.
The government has said that it recognises “the importance of ensuring that LFR is used appropriately and that there are safeguards in place to ensure this”.[36] It added that its use is “governed by data protection, equality, and human rights laws, and can only be used for a policing purpose when necessary, proportionate and fair”. It maintains that the APP on LFR has addressed the two areas of the law the Court of Appeal judgment said needed to be clarified, namely “what categories of people could be included in watchlists and under what circumstances could the technology be used”.
According to the Times, the government is planning to make a policy statement setting out its facial recognition strategy for policing in May or June 2024.[37]
The Home Office has published statutory guidance on the appropriate and effective use of surveillance camera systems in public places by relevant authorities—including chief police officers and police and crime commissioners—in its ‘Surveillance camera code of practice’. The code sets out 12 guiding principles which are intended to “draw together good practice and existing legal obligations to create a regulatory framework which can be understood by system operators and the public alike”.[38] The code is not solely concerned with the use of facial recognition technology in surveillance camera systems, but it does make some reference to it. For instance, it states that:
Any use of facial recognition or other biometric characteristic recognition systems needs to be clearly justified and proportionate in meeting the stated purpose, and be suitably validated. It should always involve human intervention before decisions are taken that affect an individual adversely.[39]
It also sets out procedures chief police officers should follow when using a surveillance camera system for live facial recognition purposes to find people on a watchlist.[40]
5. Scrutiny by the House of Lords Justice and Home Affairs Committee
The House of Lords Justice and Home Affairs Committee is “deeply concerned” that the use of LFR by the police “is being expanded without proper scrutiny and accountability”.[41] In a letter to the home secretary in January 2024, the committee said one of its main concerns was “the absence of a foundation in law for the deployment of LFR”. The committee noted that both privacy campaigners and the Metropolitan Police agreed there was no specific legislative authority. The committee said it was “concerned that the findings of the Bridges case were specific to that case and that case cannot be understood as a clear basis for the use of LFR”.[42] The committee recommended that “as well as a clear, and clearly understood, legal foundation, there should be a legislative framework, authorised by Parliament for the regulation of the deployment of LFR technology”.[43]
It also called for clearer standards and regulation on the use of LFR, such as a national compulsory training programme and standards for England and Wales, to which all police forces would have to adhere, and specific compulsory statutory criteria governing who could be put on a watchlist.[44] The committee also recommended the publication of “national regulation, or at least guidelines, kept under review, on how extensive crowd-scanning activity is being assessed with relation to its lawfulness, necessity, and proportionality, before and after the deployment of LFR”.[45]
The committee’s letter did not specifically mention protests. However, as part of an earlier, wider inquiry into the use of new technologies in the justice system, the committee had heard evidence that some people would not attend protests if they knew that facial recognition was going to be used.[46] A group of academics had suggested to the committee that the technology could have a “chilling effect” on “public assemblies, freedom of expression, and the general use of public space by certain communities and demographics”.
The government responded to the committee’s recent inquiry in March 2024.[47] It maintained that “there is a comprehensive legal framework governing police use of LFR”.[48] It described this as being made up of police common law powers to prevent and detect crime and bring offenders to justice, the Data Protection Act 2018, the Human Rights Act 1998, the Equality Act 2010, the Police and Criminal Evidence Act 1984, the College of Policing’s APP on LFR, and published police policies. The government said it believed that “principles-based primary legislation, supplemented by specific laws aligned to those principles, is the right approach to ensure that the legal framework can keep up with rapidly developing technology like LFR”.[49]
On the question of national training and standards, the government noted that the APP on LFR sets out the importance of providing training to officers using LFR technology.[50] It said that to date, only four of the 43 police forces in England and Wales had used LFR, but as its use increased, the National Police Chiefs’ Council and the College of Policing would consider the training and standards needed. The government said the APP already set out the categories of people who could be included on a watchlist and the necessary authorisation required. In response to the recommendation about national regulation on how the lawfulness, necessity and proportionality of LFR was being assessed, the government repeated that “there is a comprehensive legal framework for police use of LFR and it includes numerous safeguards”.
6. Read more
- Information Commissioner’s Office, ‘Information Commissioner’s opinion: The use of live facial recognition technology by law enforcement in public places’, 31 October 2019
- Surveillance Camera Commissioner, ‘Facing the camera: Good practice and guidance for the police use of overt surveillance camera systems incorporating facial recognition technology to locate persons on a watchlist, in public places in England and Wales’, November 2020
- Surveillance Camera Commissioner’s Office, ‘The commissioner discusses the new era for live facial recognition after the coronation’, 17 May 2023
- Home Office, ‘Surveillance camera code of practice’, amended November 2021
- Metropolitan Police, ‘Live facial recognition: Legal mandate’, March 2024
- Home Office, ‘Police use of facial recognition: Factsheet’, 29 October 2023
- House of Lords Library, ‘Facial recognition technology: Police powers and the protection of privacy’, 31 March 2021
- House of Lords Library, ‘Surveillance camera code of practice: Regret motion’, 31 January 2022
- House of Lords Library, ‘AI technology and the justice system: Lords committee report’, 23 November 2022
- Parliamentary Office of Science and Technology (POST), ‘AI in policing and security’, 29 April 2021
- House of Commons Library, ‘Police powers: Protests’, 3 August 2023
References
- The special rapporteur on the rights to freedom of peaceful assembly and of association who drew up the toolkit was Clement Nyaletsossi Voule from Togo. A new mandate holder, Gina Paola Romero Rodriguez from Colombia, was appointed at the Human Rights Council’s regular session in April 2024 (United Nations Human Rights Council, ‘Human Rights Council adopts decision on remote participation modalities for hybrid meetings, appoints 14 special procedure mandate holders, and concludes fifty-fifth regular session’, 4 April 2024).
- United Nations, ‘Practical toolkit for law enforcement officials to promote and protect human rights in the context of peaceful protests’, 7 March 2024.
- United Nations General Assembly, ‘Resolution adopted by the Human Rights Council on 8 July 2022’, A/HRC/RES/50/21, 14 July 2022.
- United Nations Human Rights Council, ‘Welcome to the Human Rights Council’, accessed 3 April 2024.
- United Nations Human Rights Council, ‘Membership of the Human Rights Council, 1 January–31 December 2023 by regional groups’, accessed 3 April 2024.
- United Nations, ‘Practical toolkit for law enforcement officials to promote and protect human rights in the context of peaceful protests’, 7 March 2024.
- United Nations General Assembly, ‘Model protocol for law enforcement officials to promote and protect human rights in the context of peaceful protests’, 31 January 2024, A/HRC/55/60, p 2 (see footnote 2).
- United Nations, ‘Human Rights Committee’, accessed 16 April 2024.
- Human Rights Committee, ‘Concluding observations on the eighth periodic report of the United Kingdom of Great Britain and Northern Ireland’ (advance unedited version), 28 March 2024, p 12.
- Human Rights Watch, ‘Human Rights Watch submission to the UN Human Rights Committee review of the United Kingdom’, February 2024.
- College of Policing, ‘Live facial recognition’, 22 March 2022; and Home Office, ‘Police use of facial recognition: Factsheet’, 29 October 2023.
- Privacy International, ‘Privacy International’s submission in advance of consideration of the eighth periodic report of the United Kingdom, Human Rights Committee, 140th session, March 2024’, February 2024, p 10.
- Liberty, ‘What is police facial recognition technology and how do we stop it?’, 11 August 2022.
- Big Brother Watch, ‘Stop facial recognition’, accessed 11 April 2024.
- HM Inspectorate of Constabulary and Fire and Rescue Services, ‘Getting the balance right? An inspection of how effectively the police deal with protests’, March 2021.
- As above, p 8.
- R (Bridges) v CC South Wales [2020] EWCA Civ 1058.
- Courts and Tribunals Judiciary, ‘Press summary: The Queen (on the application of Edward Bridges) (appellant) v the chief constable of South Wales Police (respondent) and others [2020] EWCA Civ 1058’, 11 August 2020.
- As above; and Liberty, ‘Liberty wins groundbreaking victory against facial recognition tech’, 11 August 2020. Section 149 of the Equality Act 2010 requires public authorities to have due regard to the need to eliminate discrimination, harassment and victimisation and to advance equality of opportunity between people who share a relevant protected characteristic and those who do not.
- Courts and Tribunals Judiciary, ‘Press summary: The Queen (on the application of Edward Bridges) (appellant) v the chief constable of South Wales Police (respondent) and others [2020] EWCA Civ 1058’, 11 August 2020; and ‘Order (Case ref: C1/2019/2670)’, 11 August 2020.
- College of Policing and National Police Chiefs’ Council, ‘National protest operational advice’, August 2023, p 5.
- As above, pp 39–40.
- As above, p 40.
- College of Policing, ‘Using APP’, 8 August 2022.
- College of Policing, ‘Public order public safety: Overview’, 23 October 2013.
- College of Policing, ‘Public order public safety: Thematic guidance and further information—protests’, 8 June 2023.
- College of Policing, ‘Public order public safety: Developing a policing plan—intelligence’, 8 June 2023.
- College of Policing, ‘Live facial recognition: Overview’, 22 March 2022.
- College of Policing, ‘Live facial recognition: Where—date, time, duration and location of deployment’, 27 July 2023.
- College of Policing, ‘Live facial recognition: Watchlist’, 22 March 2022.
- College of Policing, ‘Live facial recognition: Live facial recognition’, 22 March 2022.
- As above. See also: Information Commissioner’s Office, ‘Information Commissioner’s opinion: The use of live facial recognition technology by law enforcement in public places’, 31 October 2019; and Surveillance Camera Commissioner, ‘Facing the camera: Good practice and guidance for the police use of overt surveillance camera systems incorporating facial recognition technology to locate persons on a watchlist, in public places in England and Wales’, November 2020.
- Home Office, ‘Letter to police on AI-enabled facial recognition searches’, 29 October 2023.
- House of Lords Justice and Home Affairs Committee, ‘Letter from Baroness Hamwee, chair of the Justice and Home Affairs Committee, to James Cleverly MP, home secretary, regarding the use of live facial recognition by police forces’, 26 January 2024.
- Home Office, ‘Letter to police on AI-enabled facial recognition searches’, 29 October 2023.
- House of Lords Justice and Home Affairs Committee, ‘Government response to the Justice and Home Affairs Committee’s letter on live facial recognition (LFR) technology’, 25 March 2024, p 1.
- Ben Ellery and George Willoughby, ‘How facial recognition technology has changed policing’, Times (£), 5 April 2024.
- Home Office, ‘Surveillance camera code of practice’, amended November 2021, p 8.
- As above, p 11.
- As above, p 18.
- House of Lords Justice and Home Affairs Committee, ‘Letter from Baroness Hamwee, chair of the Justice and Home Affairs Committee, to James Cleverly MP, home secretary, regarding the use of live facial recognition by police forces’, 26 January 2024, p 2.
- As above, p 3.
- As above, p 14.
- As above, pp 6–7.
- As above, p 7.
- House of Lords Justice and Home Affairs Committee, ‘Technology rules? The advent of new technologies in the justice system’, 30 March 2022, HL Paper 180 of session 2021–22, p 15.
- House of Lords Justice and Home Affairs Committee, ‘Government response to the Justice and Home Affairs Committee’s letter on live facial recognition (LFR) technology’, 25 March 2024.
- As above, p 2.
- As above, p 7.
- As above, p 3.