In January a Florida court ruled that Willie Allen Lynch didn’t have a right to see photos of other suspects also identified by the facial recognition search that led to his arrest, as reported by Slate.

In 2015, undercover cops with the Jacksonville Sheriff’s Office purchased $50 of crack cocaine. They photographed the man selling it and ran the image through the Face Analysis Comparison Examination System (FACES), which returned Lynch and four other possible matches.

Ultimately, Lynch was sentenced to eight years in prison, according to the American Civil Liberties Union. What’s alarming is that The Florida Times-Union noted the software wasn’t mentioned in Lynch’s arrest report and instead, “the Sheriff’s Office said it had identified him using a manual search of its mugshot database.”

On Monday, the ACLU filed a friend-of-the-court brief — alongside the Electronic Frontier Foundation, the Georgetown Center on Privacy and Technology, and the Innocence Project — to get Florida’s Supreme Court to address the issue.

The spread of facial recognition technology into government agencies has caused concern among academics, researchers, activists, and others. They question how this technology can exacerbate pre-existing social issues, such as anti-Blackness and state surveillance, and whether it even works.

Lynch’s case, in particular, raises questions about using facial recognition technology in low-stakes crimes. It also raises the question of whether courts even care how well these programs work.

The data on facial recognition’s inaccuracy is clear. In one ACLU test, for example, facial recognition software falsely matched 28 members of Congress with mugshots, and the false matches disproportionately involved people of color.

Furthermore, reports have shown significant errors in systems like Amazon’s Rekognition. A study from researchers at MIT and the University of Toronto found Rekognition tends to mistake dark-skinned women for men. Despite that, Rekognition has been sold to law enforcement and marketed to the Department of Homeland Security’s Immigration and Customs Enforcement (ICE).

Given the government’s use of these less-than-accurate facial recognition systems, citizens should have a constitutional right to examine a system’s accuracy before being convicted based on its results. In Florida, the appeals court involved in Lynch’s case may be setting a dangerous precedent.

Law enforcement officials in Florida began using their facial recognition system in 2001, long before most states, and conduct about 8,000 searches on FACES per month, according to Slate. However, a 2016 report from Georgetown’s Center on Privacy and Technology showed that major facial recognition systems — including FACES — haven’t been audited.

FACES looks through a database of more than 33 million driver’s licenses and law enforcement photos. Most Americans are in a facial recognition database, even if they don’t realize it, but The Florida Times-Union noted Florida’s was the country’s biggest and most active.

Despite this, Jacksonville Sheriff’s Office has no formal policies around how it uses FACES, according to The Florida Times-Union.

The anti-Black history of facial recognition systems has to be accounted for with Lynch’s case. Artificial intelligence often fails to recognize Black people, especially if they’re darker-skinned. This has even been seen with concerns around self-driving cars.

“Lynch is a Black man, and the quality of the cop’s photos wasn’t fit for Instagram, much less taking a man’s liberty,” the ACLU wrote in its blog post. “Moreover, in Lynch’s case, the algorithm expressed only one star of confidence that it had generated the correct match.”

According to the ACLU, despite how widely Florida uses the system, Lynch’s appears to be the only case challenging it. That doesn’t mean the system is accurate; after all, it has never been tested.

“Prosecutorial misconduct and police adoption of face recognition technology are dangerous, and the ACLU has been pushing to halt both,” the ACLU wrote. “Until that happens, prosecutors must give defendants full access to information about the algorithm used against them in places where face recognition technology has already been deployed.”