As courts rely increasingly on predictive software, the result may be the over-policing of communities of color.
An algorithm used by police and courts to predict the likelihood that offenders will commit future crimes contains a bias that incorrectly flags Black people as likely repeat offenders at a disproportionately high rate, according to a recent study by ProPublica.
ProPublica analyzed the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) scores assigned to people arrested in Florida in 2013 and 2014; its findings were published Monday in various news outlets.
They found that only 20 percent of people flagged as high risk of committing violent crimes within two years of being arrested actually did so. They also found that Black people were twice as likely to be incorrectly flagged as future repeat offenders.
Finally, the study found that white people were more likely to be identified as being at low risk of committing crimes in the future.
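The disparity described above is a difference in false positive rates: among people who did not go on to reoffend, how often each group was nevertheless flagged as high risk. A minimal sketch of how such a rate can be computed is below; the data here is entirely synthetic and illustrative, not ProPublica's dataset, and is chosen only to mirror the reported roughly 2-to-1 disparity.

```python
# Illustrative only: measuring a false-positive-rate disparity between
# two groups. Each record is (flagged_high_risk, reoffended_within_2yrs).

def false_positive_rate(records):
    """Share of non-reoffenders who were nevertheless flagged high risk."""
    flags_for_non_reoffenders = [
        flagged for flagged, reoffended in records if not reoffended
    ]
    return sum(flags_for_non_reoffenders) / len(flags_for_non_reoffenders)

# Tiny synthetic groups (hypothetical data, not from the study).
group_a = [(True, False), (True, False), (False, False), (False, False),
           (True, True), (False, True)]   # 2 of 4 non-reoffenders flagged
group_b = [(True, False), (False, False), (False, False), (False, False),
           (True, True), (False, True)]   # 1 of 4 non-reoffenders flagged

print(false_positive_rate(group_a))  # 0.5
print(false_positive_rate(group_b))  # 0.25
```

A gap like the one printed here (0.50 versus 0.25) is the kind of asymmetry ProPublica reported: one group's non-reoffenders are wrongly labeled high risk twice as often as the other's.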
“We ran a statistical test that isolated the effect of race from criminal history and recidivism, as well as from defendants’ age and gender,” ProPublica said, as published in Fusion.
“Black defendants were still 77 percent more likely to be pegged as at higher risk of committing a future violent crime and 45 percent more likely to be predicted to commit a future crime of any kind.”
In courtrooms across the U.S., these reports are marketed as an objective, fair tool that can help determine the appropriate severity of a defendant’s sentence.
However, as ProPublica found, COMPAS draws its data from more than 100 questions, some of which rely on an interviewer’s perception of the person being assessed.
One such example of a question like this is: based on the screener’s observations, is this person a suspected or admitted gang member?
As Fusion reported, even people who have not committed crimes are being targeted by predictive software like COMPAS. PredPol, for example, uses crime statistics about particular neighborhoods to predict when and where crimes are likely to occur again.
Such software has increased concerns about the over-policing of communities where people of color live, as well as more systemic harassment from law enforcement officials.