ProPublica recently published an article by Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner called Machine Bias. In the article they reveal the prevalence of predictive models in the US justice system. These models are supposed to predict recidivism rates – the likelihood that a criminal will commit more crimes in the future. They are used by parole boards, and in some cases by judges determining bail or even sentencing. The problem is that the models are rarely correct and inherently racist.
We talked to Jeff Larson, the data scientist behind ProPublica’s investigation, about how he analyzed the data, where the company that built the racist model went wrong, and the implications of data science in criminal justice.