
Does double-blind peer-review reduce bias? Evidence from a top computer science conference

Author

Listed:
  • Mengyi Sun
  • Jainabou Barry Danfa
  • Misha Teplitskiy

Abstract

Peer review is widely regarded as essential for advancing scientific research. However, reviewers may be biased by authors' prestige or other characteristics. Double-blind peer review, in which the authors' identities are masked from the reviewers, has been proposed as a way to reduce reviewer bias. Although intuitive, evidence for the effectiveness of double-blind peer review in reducing bias is limited and mixed. Here, we examine the effects of double-blind peer review on prestige bias by analyzing the peer-review files of 5,027 papers submitted to the International Conference on Learning Representations (ICLR), a top computer science conference that changed its reviewing policy from single-blind to double-blind peer review in 2018. We find that after the switch to double-blind review, the scores given to the most prestigious authors decreased significantly. However, because many of these papers were above the acceptance threshold, the change did not significantly affect paper acceptance decisions. Nevertheless, we show that double-blind peer review may have improved the quality of the selections by limiting other (non-author-prestige) biases. Specifically, papers rejected under the single-blind format are cited more than those rejected under the double-blind format, suggesting that double-blind review better identifies lower-quality papers. Interestingly, an apparently unrelated change, the switch of the rating scale from 10 points to 4 points, likely reduced prestige bias significantly, to an extent that affected papers' acceptance. These results provide some support for the effectiveness of double-blind review in reducing prestige bias, while opening new research directions on the impact of peer review formats.
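
A minimal sketch of the kind of comparison the abstract describes, not the authors' actual analysis: a difference-in-differences style regression in which the interaction term estimates how much the score premium for high-prestige authors changed after ICLR switched to double-blind review. The column names and toy data below are hypothetical.

```python
# Hypothetical illustration (not the paper's code): estimate how the score
# premium for high-prestige authors changed after the double-blind switch.
import pandas as pd
import statsmodels.formula.api as smf

# Toy review-level data: one row per review score (all values assumed).
reviews = pd.DataFrame({
    "score":         [8, 7, 6, 5, 7, 4, 6, 5],   # reviewer scores on an assumed scale
    "high_prestige": [1, 1, 0, 0, 1, 1, 0, 0],   # 1 if the authors are highly ranked
    "double_blind":  [0, 0, 0, 0, 1, 1, 1, 1],   # 1 for submissions under the 2018+ format
})

# "high_prestige * double_blind" expands to both main effects plus their
# interaction; a negative interaction coefficient would indicate a smaller
# prestige premium under double-blind review.
model = smf.ols("score ~ high_prestige * double_blind", data=reviews).fit()
print(model.params)
```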

Suggested Citation

  • Mengyi Sun & Jainabou Barry Danfa & Misha Teplitskiy, 2021. "Does double-blind peer-review reduce bias? Evidence from a top computer science conference," Papers 2101.02701, arXiv.org.
  • Handle: RePEc:arx:papers:2101.02701

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2101.02701
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    2. Vincent Larivière & Yves Gingras, 2010. "The impact factor's Matthew Effect: A natural experiment in bibliometrics," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(2), pages 424-427, February.
    3. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    4. Vincent Larivière & Éric Archambault & Yves Gingras, 2008. "Long‐term variations in the aging of scientific literature: From exponential growth to steady‐state science (1900–2004)," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(2), pages 288-296, January.
    5. Blank, Rebecca M, 1991. "The Effects of Double-Blind versus Single-Blind Reviewing: Experimental Evidence from The American Economic Review," American Economic Review, American Economic Association, vol. 81(5), pages 1041-1067, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Teplitskiy, Misha & Duede, Eamon & Menietti, Michael & Lakhani, Karim R., 2022. "How status of research papers affects the way they are read and cited," Research Policy, Elsevier, vol. 51(4).
    2. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mengyi Sun & Jainabou Barry Danfa & Misha Teplitskiy, 2022. "Does double‐blind peer review reduce bias? Evidence from a top computer science conference," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(6), pages 811-819, June.
    2. Vicente Safón, 2019. "Inter-ranking reputational effects: an analysis of the Academic Ranking of World Universities (ARWU) and the Times Higher Education World University Rankings (THE) reputational relationship," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(2), pages 897-915, November.
    3. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    4. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    5. David Card & Stefano DellaVigna, 2017. "What do Editors Maximize? Evidence from Four Leading Economics Journals," NBER Working Papers 23282, National Bureau of Economic Research, Inc.
    6. David Card & Stefano DellaVigna, 2020. "What Do Editors Maximize? Evidence from Four Economics Journals," The Review of Economics and Statistics, MIT Press, vol. 102(1), pages 195-217, March.
    7. Laura Hospido & Carlos Sanz, 2021. "Gender Gaps in the Evaluation of Research: Evidence from Submissions to Economics Conferences," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 83(3), pages 590-618, June.
    8. Bradford Demarest & Guo Freeman & Cassidy R. Sugimoto, 2014. "The reviewer in the mirror: examining gendered and ethnicized notions of reciprocity in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 717-735, October.
    9. Rui Dai & Lawrence Donohue & Qingyi (Freda) Drechsler & Wei Jiang, 2023. "Dissemination, Publication, and Impact of Finance Research: When Novelty Meets Conventionality," Review of Finance, European Finance Association, vol. 27(1), pages 79-141.
    10. Siler, Kyle & Larivière, Vincent, 2022. "Who games metrics and rankings? Institutional niches and journal impact factor inflation," Research Policy, Elsevier, vol. 51(10).
    11. Zhang, Xinyuan & Xie, Qing & Song, Min, 2021. "Measuring the impact of novelty, bibliometric, and academic-network factors on citation count using a neural network," Journal of Informetrics, Elsevier, vol. 15(2).
    12. Cruz-Castro, Laura & Sanz-Menendez, Luis, 2021. "What should be rewarded? Gender and evaluation criteria for tenure and promotion," Journal of Informetrics, Elsevier, vol. 15(3).
    13. Marco Seeber & Alberto Bacchelli, 2017. "Does single blind peer review hinder newcomers?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 567-585, October.
    14. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    15. Laura Hospido & Carlos Sanz, 2019. "Gender gaps in the evaluation of research: evidence from submissions to economics conferences (Updated March 2020)," Working Papers 1918, Banco de España, revised Mar 2020.
    16. Osterloh, Margit & Frey, Bruno S., 2020. "How to avoid borrowed plumes in academia," Research Policy, Elsevier, vol. 49(1).
    17. Groen-Xu, Moqi & Bös, Gregor & Teixeira, Pedro A. & Voigt, Thomas & Knapp, Bernhard, 2023. "Short-term incentives of research evaluations: Evidence from the UK Research Excellence Framework," Research Policy, Elsevier, vol. 52(6).
    18. Jürgen Janger & Nicole Schmidt & Anna Strauss, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664, April.
    19. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    20. Dell'Anno, Roberto & Caferra, Rocco & Morone, Andrea, 2020. "A “Trojan Horse” in the peer-review process of fee-charging economic journals," Journal of Informetrics, Elsevier, vol. 14(3).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2101.02701. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the arXiv administrators. General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.