GMLC campaign volunteer Mariam Hussain reflects on her research into the use of face recognition technologies at protests, arguing that we need legislation to clarify the legality of their use and protect protest rights.
Last autumn I conducted a research project on the lawfulness of police use of facial recognition technology (FRT) at protests. FRT processes sensitive biometric data instantaneously.[1] Following the riots Britain saw last summer, FRT may be welcomed by many as a means of identifying people suspected of criminal activity. But should the police be using FRT on people who are not suspected of any crime, and more specifically on protestors? A recent wave of legislation has increased police powers and enabled harsher sentencing,[2] but is sufficient attention being paid to protecting our rights, and in particular those of protestors?
FRT is known to misidentify at a disproportionate rate,[3] and a significant concern is that its deployment at protests will further entrench marginalisation and cement racial prejudices, something the police are all too familiar with, given reports of institutional racism within forces.[4] With no clear legal framework governing police use of FRT at protests, I therefore sought to identify whether such use is truly lawful under current law.
What is FRT?
FRT captures a facial image and uses algorithms to compare biometric features to those within images collected in a ‘watchlist’ to produce matches. There are various types of FRT that can be used to identify individuals. My analysis of the law applies generally to FRT.
It should be noted that lawful use of FRT can be very effective and has proved so – for example, by assisting the police in identifying a wanted sex offender at the King’s Coronation.[5] However, some academics argue its deployment at protests risks creating a ‘chilling effect’: protestors may fear criminalisation for protesting and not feel fully able to exercise their freedom of expression as protected under the Human Rights Act 1998.[6] This issue is exacerbated by the current government’s proposal to criminalise wearing face coverings at protests.[7] This creates a situation where protestors who wear masks risk being arrested, while those who do not risk the police obtaining and storing their biometric data without their consent.
Moreover, FRT has issues with accuracy: it is widely known that it disproportionately misidentifies women of colour.[8]
Another issue concerns the curation of watchlists: police do not limit them to people who have committed a crime, and can go as far as to use images taken from social media.[9]
Even from the police’s own perspective, its use is risky. The Post Office scandal showed us that, even when aware of a technology’s flaws, people tend to trust technology over humans. A study assessing police perceptions of FRT found that some officers viewed FRT as ‘inaccurate’ and unnecessary, but remained willing to use it.[10]
How are police forces using it?
At the time I concluded my research in 2024, the Met Police in London and South Wales Police (SWP) had been at the forefront of FRT trials in the UK. Now, every force in the UK is using some form of FRT. Some forces use ‘live facial recognition’, which involves surveilling public spaces in real-time; others use ‘operator-initiated facial recognition’, which enables officers to scan and attempt to identify anyone they come into contact with. ‘Retroactive facial recognition’ involves forces making identifications after the event from stored images, including those taken from social media accounts.[11] The Met Police has announced plans to install permanent facial recognition cameras in Croydon, continuously scanning people every day.
What does the law say?
There are a number of potential legal questions here. FRT may infringe upon rights protected under the Human Rights Act (HRA) 1998, Data Protection Act 2018, and the Equality Act 2010. With a specific focus on protests, Articles 10 and 11 (freedom of expression, and freedom of assembly and association) of the HRA are of particular concern because these form the basis of one’s right to protest.
Academics have conducted independent reviews of various FRT trials conducted across the country. Fussey and Murray (2019) published one examining six Met Police trials, finding that the deployment of FRT was unlikely to satisfy the test necessary for an interference with human rights conventions.[12]
In terms of case law, the Court of Appeal decided Bridges in 2020, finding that the SWP’s deployment was not ‘in accordance with the law’ for the purposes of Article 8 (the right to privacy), because SWP had failed to set out its watchlist criteria, leaving officers with too much discretion.[13] However, on case law specific to protestors’ rights, we have yet to see what the courts will decide.
Consequently, for my research I focused on Article 10, which provides the basis for the right to protest. For this right to be lawfully interfered with, it must be 1) in accordance with the law, 2) pursue a legitimate aim, and 3) necessary in a democratic society.[14]
- To be in accordance with the law, a legal mandate must exist and be of ‘sufficient quality to protect against arbitrary interferences’.[15] Currently, no such explicit mandate exists for police FRT use. Little progress has been made on the Automated Facial Recognition Technology Bill introduced in the House of Lords in 2020, and even that proposed legislation pays insufficient attention to FRT used at protests. Police forces using FRT have referred to an implicit mandate derived from various primary sources,[16] two of which suggest that such an implicit mandate may potentially exist.[17] One implicit mandate is the common law, which provides police with powers to protect life and property, preserve order, and prevent and detect crime.[18] However, it is unlikely to deal comprehensively with the complexities of FRT use. In my LLM dissertation, I go into more detail about how unlikely it is that any primary source provides such a legal mandate, and the slim potential for secondary sources to be of sufficient quality to pass this test.
- If police use FRT at protests for public safety or preventing disorder, this may be deemed a legitimate aim to pursue, satisfying this limb.
- However, for the interference (here, the use of FRT at protests) to be necessary in a democratic society, it must answer a ‘pressing social need, be proportionate to the legitimate aim pursued, and the reasons justifying its use must be relevant and sufficient’.[19] Whilst FRT can be used to benefit the public – for example, by identifying a wanted sex offender at the King’s Coronation – a political public space surely raises an additional concern: if deployed at protests, FRT may cause a chilling effect on the exercise of the right to protest. The UK has seen an increase in pro-Palestine protests, and despite claims from politicians regarding disruption, arrest rates have been low, illustrating that there is no pressing need to surveil protestors en masse.[20]
Officers themselves have also not deemed FRT necessary: in a study by Urquhart and Miranda, officers said that the type of crime in their locality meant that identification was not a significant issue for them.[21] I also considered proportionality in depth, examining issues around watchlist accuracy and whether subjecting everyone present to unnecessary processing can be proportionate. The police may well be able to provide reasons justifying FRT’s use that pass the test, but as of the time I concluded my research, no such sufficient reasons had been provided, and in any case it is essential that they be provided before the police deploy FRT.
Consequently, without a clear legal mandate, explicit or implicit, governing its use, it is argued that FRT use at protests is not in accordance with the law, and whilst it may be said to pursue a legitimate aim, in its current state it cannot be deemed necessary in a democratic society.
What should happen?
Some academics maintain that secondary legislation or local policies will be sufficient to enable lawful deployment, arguing that primary legislation would be rushed and there are no gaps in the law warranting it.[22] However, FRT is novel.[23] It has the potential to be so invasive and so impactful on protesting that there is a compelling case for primary legislation. Primary legislation does not need to be rushed: the government should take time to carefully consider the views of stakeholders and experts. A centralised approach would provide the level of standardisation necessary for FRT’s use at protests, protecting against the significant inconsistencies a decentralised approach would give rise to.
The right to protest is at stake here. We cannot ignore the fact that police are increasingly using FRT in public spaces, and the potential chilling effect this will have on protests. We need legislation now.
______________________
Greater Manchester Law Centre recognises that many of the rights and freedoms we use to support our communities today were won through popular protest and demand. We support a strengthening of people’s rights to protest and oppose repressive legislation that seeks to increase sanctions for peaceful protest. To see more on our campaigning to defend our rights, see this page on our website.
Photo credit: Steve Eason, Flickr, 5/07/25, photo from a London protest of the proscription of Palestine Action.
References
[1] Evani Radiya-Dixit, ‘A sociotechnical audit: assessing police use of facial recognition’ (2022) Appendix A.
[2] Police, Crime, Sentencing and Courts Act 2022; Public Order Act 2023; and the Serious Disruption Regulations 2023.
[3] Joy Buolamwini and Timnit Gebru, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’ in Conference on Fairness, Accountability and Transparency (2018) PMLR 77.
[4] Monika Zalnieriute, ‘Facial recognition surveillance and public space: protecting protest movements’ (2024) IRL 1; Macpherson Report.
[5] Matt Mathers, ‘Wanted sex offender caught on facial recognition camera at King Charles coronation’ Independent (31 July 2023).
[6] Monika Zalnieriute, ‘Facial recognition surveillance and public space: protecting protest movements’ (2024) IRL 1.
[7] Crime and Policing Bill.
[8] Joy Buolamwini and Timnit Gebru, ‘Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification’ in Conference on Fairness, Accountability and Transparency (2018) PMLR 77.
[9] Peter Fussey and Daragh Murray, ‘Independent report on the London Metropolitan Police Service’s trial of live facial recognition technology’ (2019) 23.
[10] Lachlan Urquhart and Diana Miranda, ‘Policing Faces: the present and future of intelligent facial surveillance’ (2021) 31(2) ICTL.
[11] Peter Fussey and Daragh Murray, ‘Independent report on the London Metropolitan Police Service’s trial of live facial recognition technology’ (2019) 23.
[12] Peter Fussey and Daragh Murray, ‘Independent report on the London Metropolitan Police Service’s trial of live facial recognition technology’ (2019) 23.
[13] Bridges [2020] 1 WLR 5037 [25].
[14] Big Brother Watch and Others v UK (58170/13, 62322/14 and 24960/15) Unreported, 13 September 2018 (ECHR) 439.
[15] Bridges [2020] 1 WLR 5037 [25].
[16] HRA 1998; Freedom of Information Act 2000; Protection of Freedoms Act 2012; Data Protection Act 2018; Regulation of Investigatory Powers Act 2000; and the common law. Cited in ‘Live Facial Recognition, (LFR) MPS Legal Mandate’, 23 July 2018, p.4, and ‘Live Facial Recognition, (LFR) Operational Mandate’, 16 January 2019, p.4.
[17] Peter Fussey and Daragh Murray, ‘Independent report on the London Metropolitan Police Service’s trial of live facial recognition technology’ (2019) 50.
[18] Metropolitan Police, ‘Live Facial Recognition: Legal Mandate’ (MPS 2024).
[19] Article 10(2).
[20] Metropolitan Police, ‘Operation Brocks: Public Order Arrest Data’ (MPS 2023).
[21] Lachlan Urquhart and Diana Miranda, ‘Policing Faces: the present and future of intelligent facial surveillance’ (2021) 31(2) ICTL.
[22] Asress Adimi Gikay, ‘Regulating use by law enforcement authorities of live facial recognition technology in public spaces: an incremental approach’ (2023) 82(3) CLJ 437.
[23] Bridges [2020] 1 WLR 5037 [86].