On Wednesday, San Francisco District Attorney George Gascón announced plans to implement a new artificial intelligence tool intended to reduce bias in decisions about charging people with crimes. According to the San Francisco Examiner, the tool will be implemented on July 1.

The bias mitigation tool works by removing identifying information from police reports. Anything that could hint at a person’s race will be taken out, including descriptions of eye and hair color.

However, it’s not just physical descriptors that could imply somebody’s race. The program will also remove people’s names, locations, and neighborhoods from reports. A spokesperson for the DA also told The Verge that details about police officers will be taken out.
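The DA’s office hasn’t published the tool’s internals, but the approach it describes amounts to automated redaction of a report’s text. As a rough, hypothetical illustration only (not the actual Stanford tool), the Python sketch below uses spaCy’s off-the-shelf named-entity recognizer plus a small, illustrative list of physical descriptors to strip names, places, and proxy details from a sample report:

```python
import re
import spacy

# Load a small English pipeline (assumes `python -m spacy download en_core_web_sm`).
nlp = spacy.load("en_core_web_sm")

# Entity types that could reveal identity, race, or neighborhood (assumed set).
REDACT_LABELS = {"PERSON", "NORP", "GPE", "LOC", "FAC"}

# Physical descriptors that can act as racial proxies (illustrative list only).
DESCRIPTOR_PATTERN = re.compile(
    r"\b(black|brown|blonde?|red)\s+hair\b|\b(brown|blue|green|hazel)\s+eyes\b",
    re.IGNORECASE,
)

def redact_report(text: str) -> str:
    """Replace named entities and physical descriptors with neutral placeholders."""
    doc = nlp(text)
    redacted = text
    # Replace entities from the end of the string backwards so character offsets stay valid.
    for ent in sorted(doc.ents, key=lambda e: e.start_char, reverse=True):
        if ent.label_ in REDACT_LABELS:
            redacted = redacted[:ent.start_char] + f"[{ent.label_}]" + redacted[ent.end_char:]
    return DESCRIPTOR_PATTERN.sub("[DESCRIPTOR]", redacted)

print(redact_report(
    "Officer Smith stopped John Doe, a man with brown eyes and black hair, near the Mission District."
))
```

Even in this toy form, the limitation is visible: redaction can only hide the proxies it has been told to look for, a point the critiques below return to.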

The bias mitigation tool was developed by Alex Chohlas-Wood and a team at the Stanford Computational Policy Lab. The Verge noted that Chohlas-Wood also developed the New York Police Department’s Patternizr system.

AI has a track record of introducing or reinforcing biases within the criminal justice system. For example, a 2016 ProPublica investigation found that software used to predict a defendant’s likelihood of committing future crimes was racially biased.

That software, COMPAS, was more likely to incorrectly flag Black defendants as high risk for recidivism than white defendants. That’s alarming because the program was used to help inform sentencing and bail decisions.

However, taking a “color-blind” approach to tackling anti-Blackness and other forms of racial bias in policing isn’t a solution either. For example, San Francisco’s AI tool fails to account for officers who are known to blatantly exhibit racial biases.

In addition, issues with policing exist at structural levels. Black neighborhoods are historically over-policed. Laws themselves are often created to target vulnerable populations. San Francisco, for example, is infamous for criminalizing the homeless.

A 2015 study by the Policy Advocacy Clinic of the U.C. Berkeley School of Law found that California cities have enacted nearly 500 laws restricting and criminalizing activities associated with being homeless.

Removing racial descriptors from police reports doesn’t confront the ways in which biases are embedded in policing as an institution. The problem isn’t simply that a prosecutor reads a police report and hands out harsher sentences to people of color; even that decision is shaped by years of societal conditioning.

San Francisco may be touting its new bias mitigation tool as the first in the nation, but it’s unlikely to change much in practice.