
Arbitrariness in the peer review process

Authors

Listed:
  • Elise S. Brezis

    (Bar-Ilan University)

  • Aliaksandr Birukou

    (Springer-Verlag GmbH
    Peoples’ Friendship University of Russia (RUDN University))

Abstract

The purpose of this paper is to analyze the causes and effects of arbitrariness in the peer review process. The paper focuses on two main sources of this arbitrariness. The first is that referees are not homogeneous: they display homophily in their taste for, and perception of, innovative ideas. The second is that reviewers differ in the time they allocate to peer review. Our model replicates the NIPS experiment of 2014, showing that peer-review ratings are not robust and that changing the set of reviewers has a dramatic impact on the ranking of papers. The paper also shows that innovative work is not highly ranked under the existing peer review process and is consequently often rejected.
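
The page does not reproduce the authors' model. The following is a minimal toy sketch, not the paper's actual simulation, of the mechanism the abstract describes: reviewers with idiosyncratic tastes for innovative work (homophily) and different time budgets score the same papers, and two independently drawn committees produce noticeably different rankings, as in the NIPS 2014 experiment. All parameter names and values (panel size, acceptance cut-off, noise scales) are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    n_papers = 100          # papers under review (illustrative)
    panel_size = 3          # reviewers per paper in each committee (illustrative)
    n_accept = 25           # size of the accepted set (illustrative)

    # Latent paper quality and a flag marking "innovative" submissions.
    quality = rng.normal(0.0, 1.0, n_papers)
    innovative = rng.random(n_papers) < 0.2

    def panel_scores(quality, innovative, rng, n_reviewers=panel_size):
        """Mean score given by one randomly drawn reviewer panel.

        Each reviewer has an idiosyncratic taste for innovative work (homophily)
        and a time budget; less time spent means a noisier assessment.
        """
        total = np.zeros_like(quality)
        for _ in range(n_reviewers):
            taste = rng.normal(0.0, 1.0)                    # bonus/penalty applied to innovative papers
            effort = rng.uniform(0.2, 1.0)                  # share of time devoted to the review
            noise = rng.normal(0.0, 1.0 / effort, quality.size)
            total += quality + taste * innovative + noise
        return total / n_reviewers

    # Two independent committees evaluate the same papers.
    scores_a = panel_scores(quality, innovative, rng)
    scores_b = panel_scores(quality, innovative, rng)

    # Rank correlation between the two committees' orderings (Spearman via ranks).
    rank_a = np.argsort(np.argsort(-scores_a))
    rank_b = np.argsort(np.argsort(-scores_b))
    print("rank correlation between committees:", round(np.corrcoef(rank_a, rank_b)[0, 1], 2))

    # How much of committee A's accepted set survives under committee B?
    top_a = set(np.argsort(-scores_a)[:n_accept])
    top_b = set(np.argsort(-scores_b)[:n_accept])
    print(f"papers accepted by both committees: {len(top_a & top_b)} of {n_accept}")

    # Acceptance rate of innovative papers under committee A.
    innov_idx = np.where(innovative)[0]
    print("innovative papers accepted by A:", sum(i in top_a for i in innov_idx), "of", len(innov_idx))

Under these assumed settings the two committees typically agree on only part of the accepted set, which is the flavor of the non-robustness result the abstract reports; whether innovative papers fare well or poorly in any one run depends on the tastes of the reviewers drawn.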

Suggested Citation

  • Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
  • Handle: RePEc:spr:scient:v:123:y:2020:i:1:d:10.1007_s11192-020-03348-1
    DOI: 10.1007/s11192-020-03348-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-020-03348-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-020-03348-1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Elise S Brezis, 2007. "Focal randomisation: An optimal mechanism for the evaluation of R&D projects," Science and Public Policy, Oxford University Press, vol. 34(10), pages 691-698, December.
    2. Flaminio Squazzoni & Elise Brezis & Ana Marušić, 2017. "Scientometrics of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 501-502, October.
    3. Elizabeth L. Pier & Markus Brauer & Amarette Filut & Anna Kaatz & Joshua Raclaw & Mitchell J. Nathan & Cecilia E. Ford & Molly Carnes, 2018. "Low agreement among reviewers evaluating the same NIH grant applications," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 115(12), pages 2952-2957, March.
    4. Michail Kovanis & Ludovic Trinquart & Philippe Ravaud & Raphaël Porcher, 2017. "Evaluating alternative systems of peer review: a large-scale agent-based modelling approach to scientific publication," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 651-671, October.
    5. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    6. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    7. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
    8. Christoph Bartneck, 2017. "Reviewers’ scores do not predict impact: bibliometric analysis of the proceedings of the human–robot interaction conference," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 179-194, January.
    9. Terttu Luukkonen, 2012. "Conservatism and risk-taking in peer review: Emerging ERC practices," Research Evaluation, Oxford University Press, vol. 21(1), pages 48-60, February.
    10. Linton, Jonathan D., 2016. "Improving the Peer review process: Capturing more information and enabling high-risk/high-return research," Research Policy, Elsevier, vol. 45(9), pages 1936-1938.
    11. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    2. Carol Nash, 2023. "Roles and Responsibilities for Peer Reviewers of International Journals," Publications, MDPI, vol. 11(2), pages 1-24, June.
    3. Pengfei Jia & Weixi Xie & Guangyao Zhang & Xianwen Wang, 2023. "Do reviewers get their deserved acknowledgments from the authors of manuscripts?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5687-5703, October.
    4. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    5. Tirthankar Ghosal & Sandeep Kumar & Prabhat Kumar Bharti & Asif Ekbal, 2022. "Peer review analyze: A novel benchmark resource for computational analysis of peer reviews," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-29, January.
    6. Jibang Wu & Haifeng Xu & Yifan Guo & Weijie Su, 2023. "A Truth Serum for Eliciting Self-Evaluations in Scientific Reviews," Papers 2306.11154, arXiv.org, revised Feb 2024.
    7. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    2. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
    3. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Chapters, in: Innovation and Public Policy, pages 117-150, National Bureau of Economic Research, Inc.
    4. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Working Papers 26889, National Bureau of Economic Research, Inc.
    5. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.
    6. Elias Bouacida & Renaud Foucart, 2022. "Rituals of Reason," Working Papers 344119591, Lancaster University Management School, Economics Department.
    7. Conor O’Kane & Jing A. Zhang & Jarrod Haar & James A. Cunningham, 2023. "How scientists interpret and address funding criteria: value creation and undesirable side effects," Small Business Economics, Springer, vol. 61(2), pages 799-826, August.
    8. Albert Banal-Estañol & Ines Macho-Stadler & David Pérez-Castrillo, 2016. "Key Success Drivers in Public Research Grants: Funding the Seeds of Radical Innovation in Academia?," CESifo Working Paper Series 5852, CESifo.
    9. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    10. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2021. "Does It Pay to Do Novel Science? The Selectivity Patterns in Science Funding," Science and Public Policy, Oxford University Press, vol. 48(5), pages 635-648.
    11. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    12. Marco Ottaviani, 2020. "Grantmaking," Working Papers 672, IGIER (Innocenzo Gasparini Institute for Economic Research), Bocconi University.
    13. Nicolas Carayol, 2016. "The Right Job and the Job Right: Novelty, Impact and Journal Stratification in Science," Post-Print hal-02274661, HAL.
    14. Stephen Gallo & Lisa Thompson & Karen Schmaling & Scott Glisson, 2018. "Risk evaluation in peer review of grant applications," Environment Systems and Decisions, Springer, vol. 38(2), pages 216-229, June.
    15. Christian Catalini & Christian Fons-Rosen & Patrick Gaulé, 2020. "How Do Travel Costs Shape Collaboration?," Management Science, INFORMS, vol. 66(8), pages 3340-3360, August.
    16. Seolmin Yang & So Young Kim, 2023. "Knowledge-integrated research is more disruptive when supported by homogeneous funding sources: a case of US federally funded research in biomedical and life sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3257-3282, June.
    17. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    18. Michaël Bikard, 2020. "Idea twins: Simultaneous discoveries as a research tool," Strategic Management Journal, Wiley Blackwell, vol. 41(8), pages 1528-1543, August.
    19. van Dalen, Hendrik Peter, 2020. "How the Publish-or-Perish Principle Divides a Science : The Case of Academic Economists," Discussion Paper 2020-020, Tilburg University, Center for Economic Research.
    20. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.

    More about this item

    Keywords

    Arbitrariness; Homophily; Peer review; Innovation

    JEL classification:

    • D73 - Microeconomics - - Analysis of Collective Decision-Making - - - Bureaucracy; Administrative Processes in Public Organizations; Corruption
    • G01 - Financial Economics - - General - - - Financial Crises
    • G18 - Financial Economics - - General Financial Markets - - - Government Policy and Regulation
    • L51 - Industrial Organization - - Regulation and Industrial Policy - - - Economics of Regulation


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:123:y:2020:i:1:d:10.1007_s11192-020-03348-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.