When it comes to accuracy, facial recognition software continues to struggle with anyone who isn’t a white man. Amazon’s Rekognition program is especially notorious for providing false matches.
In July 2018, the American Civil Liberties Union (ACLU) found that Rekognition falsely matched 28 members of Congress to mugshots, including six members of the Congressional Black Caucus. Then, a January 2019 study found the program had higher error rates when identifying darker-skinned women.
Despite that, Amazon continued to defend its software — which had already been sold to police in Orlando and to the Washington County Sheriff’s Office in Oregon, and pitched to the Department of Homeland Security’s Immigration and Customs Enforcement.
Amazon’s primary defense was to accuse researchers of using the program incorrectly, even though police said they used it in exactly the same way.
Now, a report by The Washington Post has highlighted serious misuse of the program — and it’s coming from Washington County police in Oregon. Officers are running black-and-white sketches through Amazon’s Rekognition software to find a match.
Amazon’s program already struggles to recognize actual photographs. Feeding it sketches increases the likelihood of a false match, AI experts told The Washington Post and Business Insider.
“This adds another layer of complexity that will likely increase error rates,” Privacy International’s Frederike Kaltheuner told Business Insider. “Generally speaking, we are quite concerned about the use of facial recognition by police departments — both when it works and when it doesn’t. When it works it turns people into walking ID cards, when it doesn’t it risks incriminating the innocent who then have to prove that they are not guilty,” Kaltheuner added.
Last month, AI researchers — including Turing Award winner Yoshua Bengio — called on Amazon to stop selling its Rekognition system to police. One major issue for the researchers is that there are currently no safeguards in place.
Although Amazon now recommends that law enforcement only accept a match if the program is 99 percent confident, the Post reported that Washington County officers don’t follow that guideline.
Officers aren’t actually shown confidence ratings when using Rekognition; they simply review the five possible matches returned for each search, regardless of how confident the software is in any of them.
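Amazon’s guideline amounts to a simple threshold filter on the confidence scores the software reports. A minimal sketch of what enforcing the 99 percent rule would look like — using hypothetical match data, not the actual Rekognition API response format:

```python
# Sketch of Amazon's recommended guideline: accept a match only when
# the software reports at least 99 percent confidence.
# The match records below are hypothetical, for illustration only.

CONFIDENCE_THRESHOLD = 99.0  # Amazon's recommended minimum for law enforcement


def accepted_matches(matches, threshold=CONFIDENCE_THRESHOLD):
    """Return only the candidate matches at or above the threshold."""
    return [m for m in matches if m["confidence"] >= threshold]


# Five candidates, as in the searches described by the Post:
candidates = [
    {"subject_id": "A", "confidence": 99.4},
    {"subject_id": "B", "confidence": 87.2},
    {"subject_id": "C", "confidence": 74.9},
    {"subject_id": "D", "confidence": 61.0},
    {"subject_id": "E", "confidence": 55.3},
]

print(accepted_matches(candidates))  # only subject "A" survives the filter
```

With the threshold applied, four of the five candidates in this example would be discarded — which is exactly the step skipped when officers are handed all five matches with no scores attached.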
In May 2018, a coalition of groups — including the ACLU and Council on American-Islamic Relations — wrote an open letter to CEO Jeff Bezos:
“We demand that Amazon stop powering a government surveillance infrastructure that poses a grave threat to customers and communities across the country,” the groups wrote. “Amazon should not be in the business of providing surveillance systems like Rekognition to the government.”
Later this month, Amazon investors will officially vote on a proposal to stop the company from selling Rekognition to government agencies. For privacy and civil rights advocates, the fact that the program was sold at all — despite its errors and Amazon’s inability to enforce proper guidelines — is alarming.