Civil Rights Groups Want To Stop Big Tech From Selling Facial Recognition Software To the Government
Photo Credit: Computer artwork of a printed circuit board (PCB) and a male wireframe head.


Facial recognition technology is the latest tool big tech is racing to perfect, and a coalition of 85 civil rights organizations is trying to stop the country’s largest tech companies from selling it to the government.

The groups, which include the American Civil Liberties Union, Muslim Justice League, Color of Change and the National Immigration Law Center, sent letters today to Google, Microsoft and Amazon urging the companies not to sell their facial recognition technologies to the government.

“History has clearly taught us that the government will exploit technologies like face surveillance to target communities of color, religious minorities, and immigrants,” said Nicole Ozer, Technology and Civil Liberties director for the ACLU of California, in a press release. “We are at a crossroads with face surveillance, and the choices made by these companies now will determine whether the next generation will have to fear being tracked by the government for attending a protest, going to their place of worship, or simply living their lives.”

In January of last year, Google said it “fixed” a flaw in its facial recognition algorithm that misidentified black people as gorillas by blocking the terms “gorilla,” “chimp,” “chimpanzee,” and “monkey.”

Google CEO Sundar Pichai outlined the tech giant’s AI principles in a blog post, saying the company wanted to avoid creating or reinforcing unfair bias, aimed to be socially beneficial, and wanted to avoid causing injury to people.

In a December interview with the Washington Post, Pichai called fears about artificial intelligence legitimate. Google received backlash from its employees last year after the company worked with the Department of Defense to provide AI that could identify buildings and car tags. The company said that it would not sell its facial recognition technology until its dangers were addressed.

“Google has a responsibility to follow its AI principles,” the coalition said in its letter to the company. “Selling a face surveillance product that could be used by the government will never be consistent with these Principles.”

In a December blog post, Microsoft President Brad Smith highlighted some of the opportunities and issues that come with facial recognition technologies.

“Especially in its current state of development, certain uses of facial recognition technology increase the risk of decisions and, more generally, outcomes that are biased and, in some cases, in violation of laws prohibiting discrimination,” Smith said.

Smith also noted that facial recognition technologies bring new intrusions to people’s privacy and the use of AI by governments “can encroach on democratic freedoms.”

In June, more than 100 Microsoft employees protested the company’s work with ICE after the agency began separating children from their parents at the Southwest border. The employees wrote a letter calling for the end of Microsoft’s $19.4 million contract with the agency.

“As the people who build the technologies that Microsoft profits from, we refuse to be complicit,” the employees said. “We are part of a growing movement, comprised of many across the industry who recognize the grave responsibility that those creating powerful technology have to ensure what they build is used for good, and not for harm.”

The coalition commended Microsoft for addressing the issues with facial recognition technology and its work with ICE, but called for more action.

“The dangers of face surveillance can only be fully addressed by stopping its use by governments,” the coalition said in its letter to Microsoft. “This technology provides the government with an unprecedented ability to track who we are, where we go, what we do, and who we know.”

Amazon currently sells its Rekognition product to the U.S. government and has worked with law enforcement agencies in the past. The ACLU, along with various other civil rights organizations, sent another letter to Amazon CEO Jeff Bezos in May highlighting their concerns over the use of Rekognition on vulnerable communities, protestors and immigrants.

“People should be free to walk down the street without being watched by the government. Facial recognition in American communities threatens this freedom,” the coalition said in its May letter. “In overpoliced communities of color, it could effectively eliminate it.”

Amazon has also pushed for U.S. Immigration and Customs Enforcement to use Rekognition, a move that the coalition called “a threat to the safety of community members.”

In September, seven members of Congress sent letters to the Federal Trade Commission, the Federal Bureau of Investigation and the Equal Employment Opportunity Commission after the ACLU ran Amazon’s face surveillance technology on photos of members of Congress against a database of 25,000 mugshots, producing 28 false matches. Of the lawmakers misidentified, 39 percent were people of color, including Representatives John Lewis (D-GA), Lacy Clay (D-MO) and Luis Gutiérrez (D-IL).

In an Amazon blog post, the company explained that the ACLU’s test was conducted at an 80 percent confidence threshold, which carries a 5 percent misidentification rate. When the test was replicated at a 99 percent confidence threshold, the number of false positives dropped to zero.

“In real-world public safety and law enforcement scenarios, Amazon Rekognition is almost exclusively used to help narrow the field and allow humans to expeditiously review and consider options using their judgment (and not to make fully autonomous decisions),” said Dr. Matt Wood in the post.
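To illustrate why the threshold matters, here is a minimal Python sketch of how a confidence cutoff filters face-match candidates. This is not Amazon's code, and the names and scores are invented for illustration; it only shows the mechanism the dispute turns on: a lower threshold admits more candidate matches, including weak ones that may be false positives.

```python
# Hypothetical candidate matches with made-up confidence scores
# (percentages), as a face-matching system might return them.
candidate_matches = [
    {"name": "Person A", "confidence": 81.2},
    {"name": "Person B", "confidence": 86.5},
    {"name": "Person C", "confidence": 99.4},
]

def filter_matches(matches, threshold):
    """Keep only candidates at or above the confidence threshold."""
    return [m for m in matches if m["confidence"] >= threshold]

# At an 80 percent threshold, all three candidates count as matches,
# including the weaker ones that could be false positives.
print(len(filter_matches(candidate_matches, 80.0)))  # 3

# At a 99 percent threshold, only the strongest candidate survives.
print(len(filter_matches(candidate_matches, 99.0)))  # 1
```

Under this framing, the ACLU's 28 false matches and Amazon's zero-false-positive replication are both consistent outcomes of the same system run at different cutoffs.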

Large tech companies came under fire throughout 2018 for facial recognition systems that endanger people of color and other minority groups, and 2019 looks to be no different as civil rights groups continue to highlight technologies that could negatively impact minorities.

UPDATE:

Days after the coalition sent its letter to Amazon, the company’s shareholders filed a resolution that would prohibit the sale of facial recognition products to governments and law enforcement unless an independent evaluation determines that “the technology does not cause or contribute to actual or potential violations of civil and human rights.”

This version also notes Amazon’s claim that the confidence settings the ACLU used in its test of members of Congress negatively impacted the results.