Interventions Over Predictions: Reframing the Ethical Debate for Actuarial Risk Assessment

Article Snapshot

Author(s)

Chelsea Barabas, Karthik Dinakar, Joichi Ito, Madars Virza and Jonathan Zittrain

Source

Proceedings of Machine Learning Research, Vol. 81, pp. 62-76, 2018

Summary

Machine learning tools are now used in the criminal justice system to assess the risk that defendants will reoffend. Because these tools focus on the accuracy of predictions and ignore the factors that drive crime, they can exacerbate mass incarceration and inequality.

Policy Relevance

Data-driven models should instead be used to identify which interventions work and which do not. Such models could reduce crime and increase fairness.

Main Points

  • Many commentators debate whether machine learning tools used in criminal justice are racially biased.
     
    • Defendants cannot see how their scores are calculated.
       
    • Human bias can enter these systems when they are designed or when their scores are applied.
       
  • Such tools are sometimes used to decide whether a defendant should be released before trial; the criminal justice system is moving away from an “ability to pay” model (cash bail) toward a risk-based approach.
     
    • Some argue that the goal of such a system should be to make predictions that are equally accurate for all groups of defendants.
       
    • Some argue that such systems should be designed to avoid a disparate impact on some groups of defendants.
       
  • In the big picture, policymakers must consider whether the main goal of such systems should be prediction or diagnosis.
     
  • In the 1970s and 1980s, policymakers assumed that interventions designed to rehabilitate criminals were ineffective; they ignored the “ratchet effect,” that is, the tendency of involvement with the criminal justice system to increase crime by weakening defendants’ family ties and reducing their chances of finding jobs.
     
  • Machine learning systems based on the existing system will amplify the “ratchet effect” if they focus on accurate predictions but fail to investigate the underlying causes of crime or to identify effective interventions.
     
  • Traditional regression analysis confuses outcomes with causal drivers; for example, regression may flag resentment of the police as a driver of crime, when in fact intensive policing both breeds that resentment and increases crime, and resentment itself is not a cause of crime.
     
  • Better systems adopt a “causal inference framework” that compares potential outcomes under different interventions (a minimal illustration follows this list).
     
    • Randomized controlled trials randomly assign defendants to different interventions, revealing which are effective.
       
    • Observational data analysis can be used when randomized trials are not possible.
       
  • Machine learning should be used as it is used in medicine: to identify the underlying drivers of crime and to evaluate the effectiveness of interventions.
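
A minimal sketch of the potential-outcomes comparison described above, using synthetic data. The intervention, variable names, and effect sizes below are illustrative assumptions, not figures from the paper. The sketch estimates the effect of a hypothetical supportive intervention on re-arrest three ways: a randomized trial's difference in means, a naive observational comparison that is biased by confounding, and an inverse-propensity-weighted adjustment of the observational data.

```python
# Illustrative potential-outcomes sketch with synthetic data
# (assumptions only, not results or variables from the paper).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Synthetic confounder: prior contact with the criminal justice system.
prior_contact = rng.binomial(1, 0.4, n)

# --- Randomized trial: the intervention is assigned by coin flip -----------
treat_rct = rng.binomial(1, 0.5, n)
# Re-arrest probability rises with prior contact and falls with treatment.
rearrest_rct = rng.binomial(1, 0.30 + 0.20 * prior_contact - 0.10 * treat_rct)
ate_rct = (rearrest_rct[treat_rct == 1].mean()
           - rearrest_rct[treat_rct == 0].mean())

# --- Observational data: assignment depends on the confounder --------------
p_treat = 0.25 + 0.40 * prior_contact
treat_obs = rng.binomial(1, p_treat)
rearrest_obs = rng.binomial(1, 0.30 + 0.20 * prior_contact - 0.10 * treat_obs)

# Naive comparison: biased, because treated defendants differ at baseline.
naive = (rearrest_obs[treat_obs == 1].mean()
         - rearrest_obs[treat_obs == 0].mean())

# Inverse-propensity weighting with the (here, known) assignment probabilities.
weights = treat_obs / p_treat - (1 - treat_obs) / (1 - p_treat)
ate_ipw = np.mean(weights * rearrest_obs)

print("True effect of intervention: -0.10")
print(f"RCT difference in means:     {ate_rct:+.3f}")
print(f"Naive observational:         {naive:+.3f}")
print(f"IPW-adjusted observational:  {ate_ipw:+.3f}")
```

In the randomized trial the simple difference in means recovers the true effect, while the naive observational comparison does not; adjusting for how the intervention was assigned is the kind of analysis suggested when randomized trials are not possible.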
     

 
