Civil Rights Groups Say Using Algorithms In Risk-Based Bail Assessments Is A Problem

The American Civil Liberties Union (ACLU), the National Association for the Advancement of Colored People (NAACP), Color of Change, MoveOn, and 115 other groups signed a statement of concern about the use of pretrial “risk assessment” algorithms to determine bail.

A growing number of state and local governments are turning to algorithms to assess pretrial flight and criminal recidivism risks. These tools are intended to remove bias from pretrial hearings and to reduce jail overcrowding. But in the statement, the groups explain that the justice system still disproportionately impacts communities of color, so the data fed into these tools reflects those flaws.

This turns what might be intended as objective or neutral instruments into biased tools that perpetuate existing disparities in the justice system. The organizations are also calling on governments to follow a set of principles, including the following:

  • If in use, a pretrial risk assessment instrument must be designed and implemented in ways that reduce and ultimately eliminate unwarranted racial disparities across the criminal justice system.
  • If in use, a pretrial risk assessment instrument must be transparent, independently validated, and open to challenge by an accused person’s counsel. The design and structure of such tools must be transparent and accessible to the public.
  • If in use, a pretrial risk assessment instrument must communicate the likelihood of success upon release in clear, concrete terms.

And more. Check out the full list of principles in the groups' statement, and learn more about legislation regarding risk-based bail assessment algorithms.
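The groups' core argument, that a formally neutral score built on records from unequal policing will reproduce that inequality, can be illustrated with a minimal toy model. Everything below is hypothetical for illustration only: the score formula, the group names, and all the numbers are assumptions, not taken from any real risk assessment instrument.

```python
# Toy sketch of the feedback loop the statement describes: if one community
# is policed more heavily, the same underlying behavior produces more
# *recorded* arrests, and any score built on arrest records inherits
# that disparity. All numbers here are hypothetical.

def risk_score(recorded_arrests: float) -> float:
    """Toy score: more recorded prior arrests -> higher pretrial 'risk'."""
    return min(1.0, 0.2 * recorded_arrests)

# Both groups have identical true behavior...
TRUE_PRIOR_OFFENSES = 2.0
# ...but differ in how often an offense becomes a recorded arrest.
ARREST_PROB = {"group_a": 0.25, "group_b": 0.50}

scores = {
    group: risk_score(TRUE_PRIOR_OFFENSES * prob)
    for group, prob in ARREST_PROB.items()
}
# group_b ends up scored as twice the risk of group_a despite identical
# underlying behavior: the tool is "neutral" only in form.
```

In this sketch the scoring function never looks at group membership, yet the output disparity exactly tracks the disparity in policing intensity, which is the sense in which the signatories argue the input data, not the formula, carries the bias.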