
A Survey of Bias in Machine Learning Through the Prism of Statistical Parity

Author

Listed:
  • Philippe Besse
  • Eustasio del Barrio
  • Paula Gordaliza
  • Jean-Michel Loubes
  • Laurent Risser

Abstract

Applications based on machine learning models have now become an indispensable part of everyday life and the professional world. As a consequence, a critical question has recently arisen among the population: do algorithmic decisions convey any type of discrimination against specific groups of the population or minorities? In this article, we show the importance of understanding how bias can be introduced into automatic decisions. We first present a mathematical framework for the fair learning problem, specifically in the binary classification setting. We then propose to quantify the presence of bias using the standard disparate impact index on the real and well-known Adult Income dataset. Finally, we compare the performance of different approaches aiming to reduce bias in binary classification outcomes. Importantly, we show that some intuitive methods are ineffective with respect to the statistical parity criterion. This sheds light on the fact that building fair machine learning models can be a particularly challenging task, especially when the training observations contain some bias.
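
The disparate impact index referred to in the abstract has a simple empirical form: the ratio of favorable-outcome rates between the protected group and the reference group. The sketch below is not taken from the paper; it assumes a pandas DataFrame with illustrative column names ("sex", "prediction"), and the 0.8 reference threshold mentioned in the comments stems from the commonly cited four-fifths rule rather than from this article.

```python
# Minimal sketch (not from the paper): empirical disparate impact (DI) for a
# binary classifier, DI = P(Yhat = 1 | protected group) / P(Yhat = 1 | reference group).
# Column names ("sex", "prediction") and the toy data are illustrative assumptions.
import pandas as pd

def disparate_impact(df: pd.DataFrame, group_col: str, pred_col: str,
                     protected_value) -> float:
    """Ratio of positive-prediction rates: protected group over reference group."""
    protected_rate = df.loc[df[group_col] == protected_value, pred_col].mean()
    reference_rate = df.loc[df[group_col] != protected_value, pred_col].mean()
    return protected_rate / reference_rate

# Toy example shaped like the Adult Income task (1 = predicted income > 50K)
df = pd.DataFrame({
    "sex":        ["Female", "Male", "Female", "Male", "Male", "Female"],
    "prediction": [0,        1,      1,        1,      0,      0],
})
print(disparate_impact(df, "sex", "prediction", protected_value="Female"))
# Values well below 1 (often the 0.8 four-fifths threshold) are commonly read
# as evidence of disparate impact against the protected group.
```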

Suggested Citation

  • Philippe Besse & Eustasio del Barrio & Paula Gordaliza & Jean-Michel Loubes & Laurent Risser, 2022. "A Survey of Bias in Machine Learning Through the Prism of Statistical Parity," The American Statistician, Taylor & Francis Journals, vol. 76(2), pages 188-198, April.
  • Handle: RePEc:taf:amstat:v:76:y:2022:i:2:p:188-198
    DOI: 10.1080/00031305.2021.1952897

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1080/00031305.2021.1952897
    Download Restriction: Access to full text is restricted to subscribers.

