CARDIFF, Wales — In a battle over the limits of high-tech police surveillance, Big Brother has just won a round.
On Wednesday, a U.K. court dismissed a case brought against the South Wales Police over its use of live facial recognition technology, which allows police to scan the faces of thousands of individuals in real time and match their likenesses against a database of suspects.
In its ruling, the court found that the use of such technology is covered by existing law, rejecting arguments brought by privacy campaigner Ed Bridges and the London-based advocacy group Liberty that the tool's deployment amounts to a violation of human rights and data protection law.
Speaking in High Court, Lord Justice Haddon-Cave said that “the current legal regime is adequate to ensure the appropriate and non-arbitrary use of AFR Locate” — the technology used by South Wales Police.
The decision, which marks a first in European courts, deals a blow to Bridges and other campaigners who say that large-scale deployment of facial recognition by police has a chilling effect on protesters and ultimately poses a threat to democratic freedoms.
“It can be used for really noble and well-meaning ends, but it can also be used maliciously, and it can also be used for well-intended purposes that have very serious social consequences” — Karen Yeung, University of Birmingham Law professor
At a time when pro-democracy protesters in Hong Kong have been filmed sawing down facial recognition “towers” in an effort to avoid detection and arrest, the ruling suggests that courts in the U.K. and elsewhere in Europe may not stand in the way of further so-called experiments with the technology by police across the European Union, despite the bloc's far-reaching privacy protections.
Indeed, as the ruling underscores, Europe's General Data Protection Regulation (GDPR) includes exemptions for the collection of biometric data like facial likenesses by authorities, even though such information is considered “sensitive” and highly restricted in the hands of private companies.
Reacting to the ruling, Britain's Information Commissioner's Office warned about the risks of deploying a technology which it termed “intrusive” and which has the potential, “if used without the right privacy safeguards, to undermine rather than enhance confidence in the police.”
The watchdog added that it would take the court's findings into account when formulating new guidance for police following the conclusion of an investigation into police use of facial recognition.
Elsewhere in the EU, privacy watchdogs will also be paying attention to the ruling.
From high schools in France to football stadiums in Copenhagen and a train station in Berlin, police are charging ahead with similar experiments with facial recognition, prompting activists, lawmakers and officials at the European Commission to call for clearer guidelines on how the tools can be deployed and what safeguards police must provide for privacy.
After the city of San Francisco banned public use of facial recognition, the incoming Commission is examining ways to restrict the technology's deployment and limit broadening surveillance of European citizens, according to documents obtained by POLITICO.
Despite the adverse ruling, Bridges has started a debate that could well end up being decided on the other side of the Channel.
“We're very likely to see facial recognition before the European Court of Human Rights,” said Daragh Murray, a human rights law expert at the University of Essex, adding: “And this may well be the first instance.”
Liverpool FC fan vs Big Brother
Advocates of automatic facial recognition underscore its potential to improve public safety. Former U.K. Home Secretary Sajid Javid referred to the technology as “game-changing” in the fight against child abusers, as did Chris Phillips, former head of the National Counter Terrorism Security Office.
“If there are hundreds of people walking the streets who should be in prison because there are outstanding warrants for their arrest, or dangerous criminals bent on harming others in public places, the proper use of AFR has a vital policing role,” Phillips said in May.
But critics see a sinister potential. At least 18 countries are now importing the Chinese technology that Beijing uses to racially profile and closely track the country's 11 million Uighurs, a Muslim minority in China's northwest, amid reports that the technology disproportionately misidentifies ethnic minorities.
“It can be used for really noble and well-meaning ends, but it can also be used maliciously, and it can also be used for well-intended purposes that have very serious social consequences,” said Karen Yeung, a University of Birmingham Law professor and AI expert.
In Europe, police use of facial recognition is set to expand in the absence of any overarching legal framework.
“I'm not someone who's shy about putting their head above the parapet” — Ed Bridges, privacy campaigner
And as Ed Bridges' lawsuit shows, average citizens are severely limited in their ability to challenge such surveillance.