A Group of AI Researchers Is Calling On Amazon To Stop Selling Its Rekognition System To Police
Photo Credit: View of the German HQ of the online shopping giant Amazon in Munich. (Photo by Alexander Pohl/NurPhoto via Getty Images)


On Wednesday, researchers from Google, Facebook, Microsoft, and top universities — including Turing Award winner Yoshua Bengio — published an open letter calling on Amazon to stop selling its facial recognition tech to police.

Amazon’s Rekognition system is perhaps one of its most infamous tools. In May 2018, documents obtained by the American Civil Liberties Union (ACLU) of Northern California revealed Rekognition was in use by police in Orlando and Oregon.

Then in October of that same year, documents obtained by the Project on Government Oversight (POGO) revealed Amazon pitched Rekognition to US Immigration and Customs Enforcement (ICE).

This came after studies uncovered issues with Amazon’s program. In July 2018, the ACLU found Rekognition incorrectly matched 28 members of Congress to mugshots — six of them members of the Congressional Black Caucus.

In the letter, the researchers looked at one particular study that found Rekognition had substantially higher error rates when analyzing the faces of darker-skinned women. In January 2019, Amazon’s Matthew Wood and Michael Punke published blog posts arguing that the study had used the tool incorrectly.

The researchers went through several facts “reinforcing the importance of the study” and highlighted how Wood and Punke’s blog posts “misrepresented the technical details for the work and the state-of-the-art in facial analysis and facial recognition.”

Wood’s blog post centered on trying to discredit the study by arguing that it tested facial analysis rather than facial recognition, treating one as a proxy for the other. However, the researchers pointed out that there is an “indirect or direct” relationship between modern facial analysis and face recognition.

“So in contrast to Dr. Wood’s claims, bias found in one system is cause for concern in the other, particularly in use cases that could severely impact people’s lives, such as law enforcement applications,” the researchers wrote.

Most importantly, the researchers pointed out that there are no laws or required standards governing how Rekognition is used. Although Amazon claims researchers misuse the tool, police have come forward to say they use it in exactly the same way.

Currently, it’s unclear how many law enforcement agencies are using Rekognition, because Amazon won’t disclose that information.

Amazon’s Rekognition is an example of how technology can further exacerbate pre-existing social problems. Surveillance of Black communities is already commonplace, and this technology serves to make it easier.