Facial recognition technology is a powerful tool for fighting crime, but several cities have chosen to ban its use, citing concerns over privacy.
In the US, regulation is emerging at the city level: San Francisco and Oakland in California and Somerville, Massachusetts, have all introduced bans, and Cambridge, also in Massachusetts, followed suit this month.
A recent study by the National Institute of Standards and Technology (NIST) found that algorithms misidentified African-American and Asian faces up to 100 times more frequently than white faces.
“While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,” said Patrick Grother, a NIST computer scientist and the report’s primary author. “While we do not explore what might cause these differentials, this data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms.”
Despite the apparent risks of misidentification, a recent poll by the non-profit research institute the Center for Data Innovation found that only 26 percent of US residents wanted the federal government to strictly limit facial recognition. Public support for a ban dropped even further when respondents were told the technology would serve purposes such as improving public safety or preventing shoplifting.
“Blanket bans on government use are completely unjustified,” Jake Parker, senior director of government relations for the Security Industry Association, told Cities Today. “The benefits of facial recognition have been proven for more than a decade in US applications–to find missing and exploited children and to aid law enforcement investigations to name just two–without evidence of unlawful or significant misuse.”
In London, the UK capital, live facial recognition (LFR) is being used by the Metropolitan Police to track wanted criminals.
LFR cameras stream live images of people who pass through their area of focus. The images are compared to those on a central watchlist of offenders by measuring facial structure–the distance between the eyes, nose, mouth and jaw. When a match is made, an alert is sent to the police and an officer then compares the images to determine if the camera has identified a wanted criminal.
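The matching step described above can be sketched in miniature. This is an illustrative toy only: real LFR systems use learned face embeddings rather than raw landmark distances, and the landmark names, coordinates, and threshold below are all assumptions invented for the example.

```python
import math

def face_signature(landmarks):
    """Reduce facial landmarks (eye, nose, mouth positions) to a
    vector of pairwise distances -- the 'facial structure' measurements
    the article describes. Dict insertion order fixes the vector order."""
    points = list(landmarks.values())
    return [
        math.dist(points[i], points[j])
        for i in range(len(points))
        for j in range(i + 1, len(points))
    ]

def match_against_watchlist(candidate, watchlist, threshold=5.0):
    """Compare a candidate signature to each watchlist entry and return
    the closest match under the (assumed) threshold, or None if no entry
    is close enough -- the point at which an alert would be raised."""
    best_name, best_score = None, threshold
    for name, signature in watchlist.items():
        score = math.dist(candidate, signature)
        if score < best_score:
            best_name, best_score = name, score
    return best_name

# Toy watchlist: two hypothetical faces described by four landmarks each.
suspect_a = {"left_eye": (30.0, 40.0), "right_eye": (70.0, 40.0),
             "nose": (50.0, 62.0), "mouth": (50.0, 82.0)}
suspect_b = {"left_eye": (20.0, 50.0), "right_eye": (90.0, 50.0),
             "nose": (55.0, 90.0), "mouth": (55.0, 120.0)}
watchlist = {"suspect_a": face_signature(suspect_a),
             "suspect_b": face_signature(suspect_b)}

# A live capture of the same face as suspect_a, with slight camera jitter.
live = face_signature({"left_eye": (31.0, 40.0), "right_eye": (69.0, 41.0),
                       "nose": (50.0, 61.0), "mouth": (51.0, 82.0)})
print(match_against_watchlist(live, watchlist))  # → suspect_a
```

In a deployed system, a match like this would only trigger an alert; as the article notes, a human officer then compares the images before any action is taken.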
So far, 10 trials of LFR have been completed in a range of environments, and eight arrests were made in London as a direct result of using the system.
The Mayor of London, Sadiq Khan, has cautiously accepted the use of LFR, saying that while new technology has a role in keeping Londoners safe, the police must be proportionate and transparent about how it is deployed.