Police Facial Recognition Is An Authoritarian And Oppressive Surveillance Tool

Big Brother Watch said police systems had wrongly flagged thousands of innocent people, and expressed concern that photos from "false alarms" were sometimes kept by police for weeks.

Big Brother Watch (BBW), a UK civil liberties organization that "works to roll back the surveillance state", released a report revealing that the UK Metropolitan Police's experimental facial recognition system is wrong 98% of the time, making it virtually useless.

Mike Barton, the National Police Chiefs' Council lead for crime operations, said: "Facial recognition technology has the potential to help us disrupt crime networks and identify people who pose a threat to the public".

Civil liberties group Big Brother Watch today (15 May) published a report outlining serious claims about the accuracy of facial recognition tools employed by United Kingdom law enforcement bodies.

Police have been rolling out the software at major events such as sporting fixtures and music concerts, including a Liam Gallagher concert and international rugby matches, aiming to identify wanted criminals and people on watch lists.

In a blogpost published on Monday afternoon, the information commissioner, Elizabeth Denham, expressed "concerns" over the lack of transparency around the police's deployment of the technology at events such as Notting Hill Carnival and last year's Champions League final in Cardiff.

"One of those people matched was incorrectly on the watch list; the other was on a mental health-related watch list", it said.

Facial recognition is also used by South Wales Police, but 91% of its system's matches were inaccurate, despite the Home Office providing £2.6 million in funding to use the technology.

The Met admitted that, as a result of using facial recognition, it has stored the biometric data of 102 innocent people for 30 days.

United Kingdom police have admitted to using facial recognition cameras - despite the tactic being labeled as "dangerous and inaccurate" by privacy groups, who say the software is flagging thousands of innocent people.

"Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK".

The use of this technology could breach the Human Rights Act, according to the group.

South Wales Police added that a "number of safeguards" stopped police from taking action against innocent people. "Firstly, the operator in the van is able to see that the person identified in the picture is clearly not the same person, and it's literally disregarded at that point", Lewis said. Campaigners, however, insist that police must stop using the technology now.

The 15 NGOs calling for police to stop using automated facial recognition are: Big Brother Watch, Article 19, defenddigitalme, Football Supporters Federation, Index on Censorship, Institute of Race Relations, Liberty, The Monitoring Group, Netpol, Open Rights Group, Police Action Lawyers Group, Race Equality Foundation, Race On The Agenda, Runnymede Trust, and Tottenham Rights.

Innocent citizens being constantly tracked, located and identified - or, as is now most likely, misidentified as criminals - by an artificially intelligent camera system conjures up images of futuristic dystopian societies that even Orwell could scarcely have imagined.
