The New York City Police Department has revealed its use of Patternizr, a previously undisclosed tool that lets officers quickly sort through case files for related crimes, as reported by The Washington Post.

The software uses automated pattern-recognition algorithms to comb through “hundreds of thousands” of NYPD records, looking for similarities, or patterns, between cases. For now, it focuses primarily on theft and larceny.
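The NYPD has not published Patternizr's code, but the core idea of ranking case records by similarity can be sketched in a few lines. Everything below is illustrative: the feature names, case IDs, and the use of Euclidean distance are assumptions, not details from the real system.

```python
# Hypothetical sketch of similarity-based case matching.
# The feature set and distance metric here are assumptions for illustration;
# the real Patternizr system's features and model are not public.
from math import sqrt

def distance(case_a, case_b):
    """Euclidean distance between two numeric feature vectors (lower = more similar)."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(case_a, case_b)))

# Each case reduced to numeric features, e.g. (hour of day, items stolen, forced entry).
seed_case = (22, 3, 1)
candidates = {
    "case_1041": (23, 2, 1),  # late-night theft, forced entry
    "case_2210": (9, 1, 0),   # daytime, no forced entry
}

# Rank candidate cases by closeness to the seed case; in practice a human
# analyst would review the top matches to confirm a genuine pattern.
ranked = sorted(candidates, key=lambda c: distance(seed_case, candidates[c]))
print(ranked[0])  # → case_1041
```

The key design point the sketch captures is that the system surfaces likely matches for human review rather than making any determination on its own.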

Looking through records to link crimes to one another would typically take countless hours of manual work. Patternizr cuts that time down, which could ultimately save the department money.

In addition, The Washington Post reported that it’s a comprehensive system, drawing on records from all 77 of New York’s precincts. That lets analysts easily see what’s going on across the entire city.

Patternizr has been in use since 2016, but the NYPD only disclosed it publicly in an issue of the INFORMS Journal on Applied Analytics this month.

According to The Verge, the NYPD’s assistant commissioner of data analytics, Evan Levine, and its former director of analytics, Alex Chohlas-Wood, said it took the department two years to develop the software. They claim the NYPD is the first department to use a system like this in the United States.

Police have been working to automate their systems for years, so this development isn’t all that surprising. This program, at least, doesn’t claim to predict where or when crime will occur, like Palantir’s program that was secretly tested in New Orleans. It works only with information the NYPD already has in its database.

In addition, Patternizr doesn’t consider suspects’ race when looking for patterns, a measure intended to reduce racial bias, as reported by The Washington Post.

However, it’s still a system the NYPD used for years without the public knowing. That has raised concerns at the New York Civil Liberties Union, which has not been able to review the system.

NYCLU legal director Christopher Dunn told The Washington Post, “To ensure fairness the NYPD should be transparent about the technologies it deploys and allow independent researchers to audit these systems before they are tested on New Yorkers.”