
Racial Implications of Police-Algorithm Interactions: Evidence from Rearrest Predictions

Author

Listed:
  • Yong Suk Lee

Abstract

This paper examines the racial implications of police interaction with algorithms, particularly in the context of racial disparities in rearrest predictions. Our experimental study involved showing police officers the profiles of young offenders and asking them to predict rearrest probabilities within three years, first without and then after seeing the algorithm's assessment. The experiment varied the visibility of the offender's race: revealed to one group, hidden from another, and mixed (some shown and some hidden) in a third group. Additionally, we explored how informing officers about the model's accuracy affected their responses. Our findings indicate that officers adjust their predictions towards the algorithm's assessment when the race of the profile is disclosed. However, these adjustments exhibit significant racial disparities, with a significant gap in initial rearrest predictions between Black and White offenders even when all observable characteristics are controlled for. Furthermore, only Black officers significantly reduced their predictions after viewing the algorithm's assessments, while White officers did not. Our findings reveal the limited and nuanced effectiveness of algorithms in reducing bias in recidivism predictions, underscoring the complexities of algorithm-assisted human judgment in criminal justice.

Suggested Citation

  • Yong Suk Lee, 2025. "Racial Implications of Police-Algorithm Interactions: Evidence from Rearrest Predictions," CESifo Working Paper Series 11877, CESifo.
  • Handle: RePEc:ces:ceswps:_11877

    Download full text from publisher

    File URL: https://www.ifo.de/DocDL/cesifo1_wp11877.pdf
    Download Restriction: no
    ---><---

    More about this item

    Keywords

    human-computer interaction; artificial intelligence; algorithmic prediction; racial bias; criminal justice;

    JEL classification:

    • C10 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - General
    • D63 - Microeconomics - - Welfare Economics - - - Equity, Justice, Inequality, and Other Normative Criteria and Measurement
    • K40 - Law and Economics - - Legal Procedure, the Legal System, and Illegal Behavior - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:ces:ceswps:_11877. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic, or download information, contact: Klaus Wohlrabe (email available below). General contact details of provider: https://edirc.repec.org/data/cesifde.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.