Printed from https://ideas.repec.org/a/plo/pone00/0014331.html

A Reliability-Generalization Study of Journal Peer Reviews: A Multilevel Meta-Analysis of Inter-Rater Reliability and Its Determinants

Author

Listed:
  • Lutz Bornmann
  • Rüdiger Mutz
  • Hans-Dieter Daniel

Abstract

Background: This paper presents the first meta-analysis of the inter-rater reliability (IRR) of journal peer reviews. IRR is defined as the extent to which two or more independent reviews of the same scientific document agree.

Methodology/Principal Findings: Altogether, 70 reliability coefficients (Cohen's Kappa, intra-class correlation [ICC], and Pearson product-moment correlation [r]) from 48 studies were taken into account in the meta-analysis. The studies were based on a total of 19,443 manuscripts; on average, each study had a sample size of 311 manuscripts (minimum: 28, maximum: 1,983). The results of the meta-analysis confirmed the findings of the narrative literature reviews published to date: the level of IRR (mean ICC/r² = .34, mean Cohen's Kappa = .17) was low. To explain the study-to-study variation of the IRR coefficients, meta-regression analyses were calculated using seven covariates. Two covariates emerged as statistically significant in the meta-regression analyses (which aimed at approximate homogeneity of the intra-class correlations): first, the more manuscripts a study is based on, the smaller the reported IRR coefficients are; second, studies that reported information on the rating system for reviewers were associated with smaller IRR coefficients than studies that did not.

Conclusions/Significance: Studies that report a high level of IRR are to be considered less credible than those with a low level of IRR. According to our meta-analysis, the IRR of peer assessments is quite limited and needs improvement (e.g., the reader system).
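To make the reported magnitudes concrete, the sketch below computes Cohen's Kappa, the chance-corrected agreement statistic cited in the abstract, for two reviewers' accept/reject decisions. The reviewer decisions here are hypothetical illustration data, not data from the study; they are chosen only to show how modest agreement translates into a Kappa near the meta-analytic mean of .17.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Proportion of manuscripts on which the two reviewers agree.
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Agreement expected by chance, from each reviewer's marginal frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(freq_a) | set(freq_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical decisions for 10 manuscripts (A = accept, R = reject).
rev1 = ["A", "A", "R", "R", "A", "R", "A", "R", "A", "R"]
rev2 = ["A", "R", "R", "A", "A", "R", "R", "R", "A", "A"]
print(round(cohens_kappa(rev1, rev2), 2))  # prints 0.2
```

Note that 60% raw agreement yields a Kappa of only .20, because two reviewers who each accept half of all manuscripts would already agree 50% of the time by chance; this is why chance-corrected coefficients such as Kappa and the ICC are used in the studies the meta-analysis aggregates.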

Suggested Citation

  • Lutz Bornmann & Rüdiger Mutz & Hans-Dieter Daniel, 2010. "A Reliability-Generalization Study of Journal Peer Reviews: A Multilevel Meta-Analysis of Inter-Rater Reliability and Its Determinants," PLOS ONE, Public Library of Science, vol. 5(12), pages 1-10, December.
  • Handle: RePEc:plo:pone00:0014331
    DOI: 10.1371/journal.pone.0014331

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0014331
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0014331&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0014331?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Steven Wooding & Thed N Van Leeuwen & Sarah Parks & Shitij Kapur & Jonathan Grant, 2015. "UK Doubles Its “World-Leading” Research in Life Sciences and Medicine in Six Years: Testing the Claim?," PLOS ONE, Public Library of Science, vol. 10(7), pages 1-10, July.
    2. Lei Li & Yan Wang & Guanfeng Liu & Meng Wang & Xindong Wu, 2015. "Context-Aware Reviewer Assignment for Trust Enhanced Peer Review," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-28, June.
    3. Teplitskiy, Misha & Acuna, Daniel & Elamrani-Raoult, Aïda & Körding, Konrad & Evans, James, 2018. "The sociology of scientific validity: How professional networks shape judgement in peer review," Research Policy, Elsevier, vol. 47(9), pages 1825-1841.
    4. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    5. Jue Ni & Zhenyue Zhao & Yupo Shao & Shuo Liu & Wanlin Li & Yaoze Zhuang & Junmo Qu & Yu Cao & Nayuan Lian & Jiang Li, 2021. "The influence of opening up peer review on the citations of journal articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9393-9404, December.
    6. Rüdiger Mutz & Tobias Wolbring & Hans-Dieter Daniel, 2017. "The effect of the “very important paper” (VIP) designation in Angewandte Chemie International Edition on citation impact: A propensity score matching analysis," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(9), pages 2139-2153, September.
    7. David A. M. Peterson, 2020. "Dear Reviewer 2: Go F’ Yourself," Social Science Quarterly, Southwestern Social Science Association, vol. 101(4), pages 1648-1652, July.
    8. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    9. Peter Vaz da Fonseca & Andrea Decourt Savelli & Michele Nascimento Juca, 2020. "A Systematic Review of the Influence of Taxation on Corporate Capital Structure," International Journal of Economics & Business Administration (IJEBA), International Journal of Economics & Business Administration (IJEBA), vol. 0(2), pages 155-178.
    10. Pengfei Jia & Weixi Xie & Guangyao Zhang & Xianwen Wang, 2023. "Do reviewers get their deserved acknowledgments from the authors of manuscripts?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5687-5703, October.
    11. Bradford Demarest & Guo Freeman & Cassidy R. Sugimoto, 2014. "The reviewer in the mirror: examining gendered and ethnicized notions of reciprocity in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 717-735, October.
    12. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2012. "Heterogeneity of Inter-Rater Reliabilities of Grant Peer Reviews and Its Determinants: A General Estimating Equations Approach," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-10, October.
    13. Vincent Chandler, 2019. "Identifying emerging scholars: seeing through the crystal ball of scholarship selection committees," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 39-56, July.
    14. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    15. Augusteijn, Hilde Elisabeth Maria & Wicherts, Jelte M. & Sijtsma, Klaas & van Assen, Marcel A. L. M., 2023. "Quality assessment of scientific manuscripts in peer review and education," OSF Preprints 7dc6a, Center for Open Science.
    16. Hans van Dijk & Marino van Zelst, 2020. "Comfortably Numb? Researchers’ Satisfaction with the Publication System and a Proposal for Radical Change," Publications, MDPI, vol. 8(1), pages 1-20, March.
    17. Laura Muñoz-Bermejo & Jorge Pérez-Gómez & Fernando Manzano & Daniel Collado-Mateo & Santos Villafaina & José C Adsuar, 2019. "Reliability of isokinetic knee strength measurements in children: A systematic review and meta-analysis," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-15, December.
    18. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    19. John Jerrim, 2019. "Peer-review of grant proposals. An analysis of Economic and Social Research Council grant applications," DoQSS Working Papers 19-05, Quantitative Social Science - UCL Social Research Institute, University College London.
    20. Grażyna Wieczorkowska & Katarzyna Kowalczyk, 2021. "Ensuring Sustainable Evaluation: How to Improve Quality of Evaluating Grant Proposals?," Sustainability, MDPI, vol. 13(5), pages 1-11, March.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0014331. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.