
Software That Predicts Crime is Biased Against Black People

Inmates walk in San Quentin State Prison in San Quentin, California. | Photo: Reuters

Published 24 May 2016
Opinion

As courts rely increasingly on predictive software, the result may be the over-policing of communities of color.

An algorithm used by police to predict the likelihood that criminals will reoffend was designed with an internal bias that incorrectly flags Black people as more likely to become repeat offenders, according to a recent study by ProPublica.

ProPublica analyzed the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) scores assigned to people arrested in Florida in 2013 and 2014, according to the study, which was published Monday and picked up by various news outlets.


They found that only 20 percent of the people flagged as high risk of committing violent crimes within two years of arrest actually did so. They also found that Black people were twice as likely to be incorrectly flagged as future repeat offenders.

Finally, the study found that white people were more likely to be identified as being at low risk of committing crimes in the future.
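ProPublica published its underlying data and analysis (github.com/propublica/compas-analysis), which makes the disparity described above easy to reproduce. The sketch below computes a false positive rate per racial group: the share of defendants who did not reoffend within two years but were nevertheless flagged as high risk. It assumes the column names from ProPublica's published CSV (race, decile_score, two_year_recid) and follows their convention of treating a decile score of 5 or higher as "high risk"; treat both as assumptions if adapting this to other data.

```python
# Minimal sketch: false positive rate of the risk score, broken down by race.
# Assumes ProPublica's published dataset and their decile-score >= 5 cutoff.
import pandas as pd

df = pd.read_csv("compas-scores-two-years.csv")

# Dichotomize the 1-10 decile score the way ProPublica did (5+ = high risk).
df["high_risk"] = df["decile_score"] >= 5

for race, group in df.groupby("race"):
    # Restrict to people who did NOT reoffend within two years...
    non_recidivists = group[group["two_year_recid"] == 0]
    # ...and ask how many of them were wrongly flagged anyway.
    fpr = non_recidivists["high_risk"].mean()
    print(f"{race}: false positive rate = {fpr:.1%}")
```

A large gap between groups in this number is exactly the "incorrectly flagged" disparity the study reports.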


“We ran a statistical test that isolated the effect of race from criminal history and recidivism, as well as from defendants’ age and gender,” ProPublica said, as published in Fusion.

“Black defendants were still 77 percent more likely to be pegged as at higher risk of committing a future violent crime and 45 percent more likely to be predicted to commit a future crime of any kind.”
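The "statistical test" ProPublica describes is a regression that controls for the other factors so the race coefficient can be read on its own. A hedged sketch of that kind of analysis is below: a logistic regression predicting the high-risk label from race while holding criminal history, recidivism, age, and gender fixed. The column names come from ProPublica's public dataset, but the exact model specification here is an illustration, not their code.

```python
# Sketch of a logistic regression isolating the effect of race from
# criminal history, recidivism, age, and gender (per the quote above).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("compas-scores-two-years.csv")
df["high_risk"] = (df["decile_score"] >= 5).astype(int)

# White ("Caucasian") defendants are the reference category, so each race
# coefficient compares that group's odds of a high-risk label to whites'.
model = smf.logit(
    "high_risk ~ C(race, Treatment('Caucasian')) + priors_count"
    " + two_year_recid + age + C(sex)",
    data=df,
).fit()

# exp(coefficient) is an odds ratio: a value near 1.45 for Black defendants
# would correspond to the "45 percent more likely" figure quoted above.
print(np.exp(model.params))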

In courtrooms across the U.S., these reports are marketed as an objective and fair tool that can help determine the appropriate severity of a defendant's sentence.

However, as ProPublica found, COMPAS also draws on answers to more than 100 questions, some of which depend on an interviewer's subjective perception of the person being assessed.


One example: based on the screener's observations, is this person a suspected or admitted gang member?

As Fusion reported, even people who have not committed crimes are being targeted by predictive software like COMPAS. PredPol, for example, uses crime statistics from certain neighborhoods to predict when and where crimes are likely to occur again.
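PredPol's actual model is proprietary (reportedly a self-exciting point-process model borrowed from earthquake aftershock forecasting), so the toy sketch below is not that algorithm. It only illustrates the general mechanism of place-based forecasting from historical crime records, and why it can feed back on itself: cells with more recorded incidents get flagged for more patrols, which in turn produce more recorded incidents. All data and names here are hypothetical.

```python
# Toy illustration of place-based crime forecasting (NOT PredPol's model):
# count past recorded incidents per grid cell and flag the busiest cells.
from collections import Counter

# (latitude, longitude) of past recorded incidents -- hypothetical data.
incidents = [
    (29.651, -82.325), (29.652, -82.324), (29.651, -82.326),
    (29.700, -82.400),
]

def cell(lat: float, lon: float, size: float = 0.01) -> tuple[int, int]:
    """Snap a coordinate to a grid cell roughly `size` degrees on a side."""
    return (int(lat / size), int(lon / size))

counts = Counter(cell(lat, lon) for lat, lon in incidents)

# "Predict" tomorrow's hot spots as the most frequently recorded cells.
# Note the feedback loop: heavier patrols in a flagged cell generate more
# recorded incidents there, raising its count for the next forecast.
for grid_cell, n in counts.most_common(3):
    print(f"cell {grid_cell}: {n} recorded incidents -> flagged for patrol")
```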

Such software has heightened concerns about the over-policing of communities where people of color live, as well as about more systematic harassment by law enforcement officials.
