
Scanning the Streets: Should the UK Allow Facial Recognition in Public Spaces?



Can the United Kingdom Justify Facial Recognition in a Liberal Democracy?


Police forces in the United Kingdom now use live facial recognition (LFR) in public places

such as train stations, shopping centres and major events. The technology identifies

individuals by extracting biometric data without notice or consent, raising constitutional and ethical concerns. Most notably, it operates without specific statutory authorisation from

Parliament. These concerns lead to a central question: can the United Kingdom defend facial recognition while preserving the principles of a liberal democracy?


Facial recognition is often presented as a tool for public safety, yet its unchecked use

challenges the very freedoms it claims to protect. Questions about legality, equality, and

democratic oversight lie at the heart of this debate. Beyond its technical appeal, LFR

represents a deeper tension between security and liberty – between a state’s desire to prevent harm and a citizen’s right not to be watched. Whether the UK can justify such surveillance within a liberal democracy depends on how it confronts these tensions and reconciles innovation with the fundamental values of consent, transparency and justice.


Legal Accountability and Democratic Oversight


A liberal democracy regulates state power through law, transparency and public consent. Live facial recognition challenges this structure. Police forces rely on broad legal frameworks such as the Data Protection Act 2018 rather than on specific legislation, approved by Parliament, designed for

biometric technology (House of Commons Library, 2024). General data protection law does

not regulate who may be scanned, how long biometric data may remain in police hands or

how supervisory bodies should monitor the technology.


The Court of Appeal exposed this gap in Bridges v South Wales Police (2020) by ruling that

South Wales Police violated privacy rights because its use of LFR lacked clear safeguards

and criteria. The Court highlighted a legal vacuum that allows police discretion without

democratic oversight. Even after the ruling, Parliament has not passed a law to govern LFR,

so police forces continue to use biometric surveillance without a formal mandate from elected representatives.


In response to public criticism, the National Police Chiefs’ Council (NPCC) argues that LFR

fits within existing policing powers and strengthens public protection by helping identify

suspects of serious crimes (NPCC, 2024). This defence assumes legal adequacy without

addressing the constitutional problem: police operational powers do not replace the need for parliamentary scrutiny. A democratic state cannot expand surveillance on the basis of internal interpretation. It must rely on visible, public law.


The Hidden Bias in Biometric Policing


Facial recognition software does not recognise all faces with equal accuracy. A study

published in IEEE Transactions on Technology and Society in 2020 demonstrated that LFR

software produces more errors when it scans darker-skinned individuals and women,

especially during movement or in low-light settings (Drozdowski et al., 2020). These

inaccuracies predictably increase suspicion, police intervention and potential detention for

specific demographic groups.


Political philosopher Nancy Fraser explains that a society becomes unjust when its

institutions deny equal participation and recognition to some of its members (Fraser, 1997).

LFR reinforces this injustice by embedding unequal treatment into the most coercive function of the state: policing. The technology uses physical traits to assign disproportionate risk to already marginalised communities. These outcomes contradict the democratic promise of equal protection before the law.


The technology supplier NEC Corporation claims that system training improvements will

reduce these biases over time (NEC, 2024). However, police forces already deploy the

technology, so future improvements do not resolve the current harms. Algorithmic reform

cannot justify discriminatory surveillance now. A democratic state cannot expose individuals to unequal policing while waiting for a promised technical fix.


Trust Breaks Down When Surveillance Operates in the Shadows


Live facial recognition raises a final concern: democratic legitimacy. A democratic state

requires public trust and accountability, especially when it uses intrusive measures. However, police forces deploy LFR without evidence that it reduces crime or provides a measurable public benefit. There is no independent, peer-reviewed evaluation of LFR outcomes in the United Kingdom, yet authorities continue to use it in public settings.


Political theorist Michael Walzer argues that states often introduce surveillance powers under claims of necessity and then normalise them without public contestation. The United

Kingdom illustrates this pattern. Police use LFR in spaces where people move freely without

expecting identity extraction. Authorities do not publish deployment data or disclose whether biometric matches lead to successful interventions. This practice weakens public trust and obscures whether LFR serves democratic or bureaucratic interests.


The Metropolitan Police Service defends LFR by comparing it to CCTV and describing it as

a tool that increases efficiency in identifying individuals already wanted by law enforcement. This analogy hides a critical distinction. CCTV observes. LFR identifies and records biometric traits to match faces with police databases. This capability alters the meaning of public space and turns civilian movement into data capture. No democratic justification can rest on a comparison that conceals the reality of biometric surveillance.


Policy Recommendations


  1. Create a Dedicated Act of Parliament Regulating Facial Recognition


Groups such as Liberty and Big Brother Watch have shown that LFR operates without

democratic approval and without clear rules for data use, storage or deletion (Liberty, 2020).

Parliament must pass a law that explicitly defines the conditions under which LFR may

operate. This law must require pre-authorisation for deployments, independent audits and

judicial remedies for individuals affected by unjust identification. A dedicated Act of

Parliament would establish a legal foundation where none currently exists and require public authorities to justify LFR instead of assuming its legitimacy.


  2. Mandate Public Transparency and Data Disclosure for All LFR Deployments


The Home Office must require public bodies using LFR to publish annual deployment reports that include the purpose of each deployment, the number of scans conducted, demographic error rates and the retention period for biometric data. In 2021, Amnesty International UK documented how secrecy in policing disproportionately harms already over-policed communities and removes the possibility of public contestation. Public reporting would replace opacity with democratic accountability and allow citizens, courts and legislators to assess whether LFR protects or undermines public safety. Transparency rebuilds trust and forces public authorities to defend biometric surveillance with evidence, not assumption.


Conclusion


The United Kingdom cannot, at present, justify the deployment of live facial recognition in

public spaces. The technology operates without a clear statutory foundation, reinforces

existing structural inequalities, and lacks meaningful democratic oversight. In a liberal

democracy, the use of biometric surveillance must be grounded in legality, necessity, and

proportionality, supported by public consent and judicial accountability. Absent such

safeguards, the continued expansion of facial recognition risks transforming public space into an environment of constant observation and eroding the constitutional principles it purports to uphold. Legislative intervention and comprehensive regulatory oversight are therefore essential to ensure that technological innovation does not outpace the rule of law.



Bibliography

Drozdowski, Pawel, et al. “Demographic Bias in Biometrics: A Survey on an Emerging Challenge.” IEEE Transactions on Technology and Society, vol. 1, no. 2, 2020, pp. 89–103. https://ieeexplore.ieee.org/document/9091342.


Fraser, Nancy. Justice Interruptus: Critical Reflections on the ‘Postsocialist’ Condition. Routledge, 1997.


House of Commons Library. Police Use of Live Facial Recognition Technology. House of Commons, 2024. https://commonslibrary.parliament.uk/research-briefings/cdp-2024-0144/.


Liberty. “Police Use of Facial Recognition Technology.” Liberty Human Rights, 2020. https://www.libertyhumanrights.org.uk/issue/legal-challenge-ed-bridges-v-south-wales-police/.


NEC Corporation. “NEC's Cutting-edge AI Technologies Guide Book.” NEC Global, 2024. https://www.nec.com/en/global/solutions/ai/download/necaiguidebook/NEC_AI_Guide_Book_en.pdf.


Rawls, John. A Theory of Justice. Harvard University Press, 1971.


Walzer, Michael. Just and Unjust Wars. Basic Books, 2006.


Durham Think Tank is a Durham SU student group whose details are: Durham Students’ Union (also known as Durham SU or DSU) is a charity registered in England and Wales (1145400) and a company limited by guarantee (07689815), and its principal address is Dunelm House, New Elvet, DURHAM, County Durham, DH1 3AN