Essex police pause facial recognition camera use after study finds racial bias

Original link: https://www.theguardian.com/technology/2026/mar/19/essex-police-pause-facial-recognition-camera-use-study-racial-bias

Essex police have temporarily stopped using live facial recognition (LFR) technology after a University of Cambridge study revealed significant racial bias. The research showed the LFR system was statistically more likely to correctly identify black individuals than people from other ethnic groups, raising fairness concerns despite high overall accuracy. The Information Commissioner's Office (ICO) highlighted the pause and warned the other police forces currently using LFR (13 in total across England and Wales) to address potential bias. Nevertheless, the home secretary had previously announced plans to substantially increase the number of LFR vans available nationally. While the study found misidentifications of innocent people were extremely rare, experts believe possible causes of the bias include overtraining of the algorithm. The findings reinforce existing concerns about the fairness of AI-driven surveillance, with critics such as Big Brother Watch calling for a halt to deployments until accuracy and bias are conclusively addressed.

## Facial recognition bias prompts Essex police pause

Essex police paused their use of facial recognition technology after a study found racial bias. The system *correctly* identified black individuals at a higher rate than other ethnic groups, sparking debate about fairness and potential discrimination. Previous concerns had centred on the technology's *inability* to accurately identify people of colour, whereas this issue raises questions about disproportionate impact.

Experts argue the problem is not the technology itself, since fixes for this kind of bias were developed years ago, but that cost-cutting has led police to adopt systems from newer, less experienced companies whose models are insufficiently trained. Commentators highlighted the large amounts of data that accurate facial recognition requires, and pointed to recent misidentification incidents (people identified as animals or vehicles). The core ethical dilemma is whether a system that works *better* on one group, potentially subjecting it to disproportionate scrutiny, is acceptable. Some argue for stronger enforcement, while others stress the dangers of systemic bias and wrongful arrests.

Original article

Essex police have paused the use of live facial recognition (LFR) technology after a study found cameras were significantly more likely to target black people than people of other ethnicities.

The move to suspend use of the AI-enabled systems was revealed by the Information Commissioner’s Office (ICO), which regulates the use of the technology deployed so far by at least 13 police forces in London, south and north Wales, Leicestershire, Northamptonshire, Hampshire, Bedfordshire, Suffolk, Greater Manchester, West Yorkshire, Surrey and Sussex.

The ICO said Essex police had paused LFR deployments “after identifying potential accuracy and bias risks” and warned other forces to have mitigations in place. LFR systems are either mounted to fixed locations or deployed in vans. In January, the home secretary, Shabana Mahmood, announced the number of LFR vans would increase five-fold, with 50 available to every police force in England and Wales.

Essex commissioned University of Cambridge academics to conduct a study, which involved 188 actors walking past cameras being actively deployed from marked police vans in Chelmsford. The results were published last week and showed about half of the people on a watchlist were correctly identified and incorrect identifications were extremely rare, but the system was more likely to correctly identify men than women and it was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

Live facial recognition vans are being made available more widely to police forces across England and Wales. Photograph: Andrew Matthews/PA

This “raises questions about fairness that require continued monitoring”, the report concluded. One of its authors, Dr Matt Bland, a criminologist, told the Guardian and Liberty Investigates: “If you’re an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you’re black. To me, that warrants further investigation.”

The problem differs from the more common public concern about the technology, which is that it misidentifies innocent people. Last month it emerged that police had arrested a man for a burglary committed 100 miles away, in a city he had never visited, after retrospective face-scanning software confused him with another person of south Asian heritage.

Possible reasons for the latest issue with LFR include overtraining of the algorithm on the faces of black people. Experts believe it could be rectified by adjusting system settings. A separate study of the same technology by the government’s National Physical Laboratory found black men were most likely to be correctly matched by the system and white men least likely, but the effect was not statistically significant.

The Home Office has said LFR cameras deployed in London from January 2024 to September 2025 led to more than 1,300 arrests of people wanted for crimes including rape, domestic abuse, burglary and grievous bodily harm. But opponents of facial recognition technology said the latest research showed warnings about bias in LFR technology were being borne out.

“Police across the country must take note of this fiasco,” said Jake Hurfurt, the head of research and investigations at Big Brother Watch. “AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.”

Essex police were contacted for comment.
