Metropolitan Police to Pilot Facial Recognition for Identity Verification
The Metropolitan Police is set to launch a pilot programme that will see officers using automated facial recognition technology to scan citizens' faces for identity checks. This initiative, confirmed by Mayor Sadiq Khan, involves 100 officers employing handheld devices over a six-month period. The move aims to enhance policing efficiency but has sparked controversy, with opponents labelling it "alarming" due to concerns over privacy and accuracy.
Expansion of Facial Recognition in Policing
This pilot represents a significant extension of facial recognition technology in UK law enforcement. Previously, such systems have been deployed via cameras mounted on vans and at fixed locations in areas such as Croydon, Manchester, and South Wales, and retrospective facial recognition is already widely used across the country. The Met's website had previously stated that it "does not presently use the so-called operator initiated facial recognition," making this new pilot a notable shift in policy.
Controversy and Criticism
The announcement follows recent incidents highlighting potential flaws in the technology. For example, the Guardian reported a case in which police arrested an Asian man for a burglary committed 100 miles away after facial recognition software misidentified him, raising concerns about racial bias. Zoë Garbett, the Green party London Assembly member whose questioning prompted Khan's disclosure, criticised the pilot as "an alarming change" that alters the public's relationship with the police. She argued it allows officers to "walk up and scan people's faces" without a clear legal framework, threatening fundamental rights.
Khan defended the technology, stating it would be used only during police stops or when officers doubt a person's identification, potentially avoiding unnecessary arrests. He emphasised that the alternative might be detaining individuals and taking them to a station, which the device could make unnecessary.
Regulatory and Ethical Concerns
The pilot emerges amid calls from the Equality and Human Rights Commission for an independent oversight body to regulate facial recognition use in the UK. Mary Ann Stephenson, chair of the watchdog, highlighted risks of inaccuracy and racial disparities in false identifications, urging a stronger legal framework. Meanwhile, Policing Minister Sarah Jones has praised the technology as a breakthrough comparable to DNA matching, with a recent Home Office consultation exploring its expansion.
In South Wales, operator-initiated facial recognition is already in use, with police running NEC's "NeoFace" algorithm on smartphones to identify missing persons or people at risk. However, civil liberties groups such as Big Brother Watch warn that vague definitions of acceptable use could lead to the technology being deployed in non-crime scenarios.
Future Implications and Stakeholder Engagement
Khan has committed to consulting stakeholders, including the London policing ethics panel, on legal, ethical, and community impacts. The Met reported that earlier pilots, such as the Croydon live facial recognition initiative using lamp-post cameras, led to over 100 arrests in three months. As the debate continues, this pilot underscores the balancing act between leveraging AI for public safety and safeguarding civil liberties in an increasingly digital policing landscape.
