As awareness of facial recognition continues to grow, a primary concern has been its potential to open new avenues for mass surveillance. That concern grew even more pressing as people realized that facial recognition could be built into body cameras, essentially creating roving, real-time surveillance systems on the chests of police.

On Thursday, Axon, the company that created the Taser and supplies body cameras and software to 47 of the 69 largest police agencies in the United States, announced a ban on the use of facial recognition on its devices. Although this can certainly be considered a temporary victory, Axon's announcement must be analyzed carefully: within its social context, through the words the company used, and against the company's own history.

Axon's decision follows the first report of the AI and Policing Technology Ethics Board that the company formed in April 2018. The board was created to guide Axon in developing products ethically and in examining their impact on communities. The board wrote:

“Face recognition technology is not currently reliable enough to ethically justify its use on body-worn cameras. At the least, face recognition technology should not be deployed until the technology performs with far greater accuracy and performs equally well across race, ethnicities, genders, and other identity groups. Whether face recognition on body-worn cameras can ever be ethically justifiable is an issue the Board has begun to discuss, and will take up again if and when these prerequisites are met.”

Within the board's decision, careful attention must be paid to the phrasing "not currently reliable enough" and the accompanying suggestion that the technology can one day reach a point of reliability. Indeed, facial recognition has a multitude of problems when it comes to recognizing, or even seeing, anyone who isn't a white man. Amazon's Rekognition offers a ready illustration of some of these issues.

In July 2018, the ACLU reported that Rekognition incorrectly matched 28 members of Congress to mugshots, including six members of the Congressional Black Caucus. Earlier this year, researchers from the M.I.T. Media Lab found that Rekognition had higher error rates when identifying darker-skinned women. Testifying at a House hearing where advocates were requesting a moratorium on facial recognition, one of the researchers, Joy Buolamwini, who also founded the Algorithmic Justice League, said of her research:

“In one test, Amazon Rekognition even failed on the face of Oprah Winfrey, labeling her male. Personally, I’ve had to resort to literally wearing a white mask to have my face detected by some of this technology. Coding in white face is the last thing I expected to be doing at M.I.T., an American epicenter of innovation.”

On its face, all of this seems to be evidence supporting the Board's conclusion: facial recognition isn't ready to be commercialized yet. However, the Board misses perhaps the most obvious factor in this discussion: the role of the police. Although facial recognition may be riddled with errors, even a perfect program does not belong in the hands of police departments that would turn it against already vulnerable communities. Beyond the police themselves, the society we live in has created conditions under which facial recognition simply shouldn't exist, because its impacts will not be evenly distributed.

Axon may have tried to paint itself as doing some good by creating this ethics board, but the best thing the company could do is not exist. Axon is, after all, the developer of the Taser. Although no government agency systematically tracks Taser use or its impacts, a 2019 Reuters review of police records, news reports, and court documents found that at least 49 people died in 2018 alone after being shocked by police with a Taser.

There will never be a point where police use new technologies like facial recognition for the benefit of marginalized communities, because police do not exist for the benefit of the oppressed. Nothing makes that clearer than body cameras themselves. Although they were often touted as solutions to police brutality, people are now rushing to ensure they can't be used to surveil people en masse simply for existing on the streets.

Ultimately, Axon’s decision is temporary relief disconnected from history. The company is waiting on a “maybe” that should never come. Facial recognition will never be reliable enough for police use or incorporation into body cams.