Printed from https://ideas.repec.org/a/plo/pone00/0048509.html

Heterogeneity of Inter-Rater Reliabilities of Grant Peer Reviews and Its Determinants: A General Estimating Equations Approach

Author

Listed:
  • Rüdiger Mutz
  • Lutz Bornmann
  • Hans-Dieter Daniel

Abstract

Background: One of the most important weaknesses of the peer review process is that different reviewers’ ratings of the same grant proposal typically differ. Studies on the inter-rater reliability of peer reviews mostly report only average values across all submitted proposals. But inter-rater reliabilities can vary depending on the scientific discipline or the requested grant sum, for instance. Goal: Taking the Austrian Science Fund (FWF) as an example, we aimed to investigate empirically the heterogeneity of inter-rater reliabilities (intraclass correlation) and its determinants. Methods: The data consisted of N = 8,329 proposals with N = 23,414 overall ratings by reviewers, which were statistically analyzed using the generalized estimating equations approach (GEE). Results: We found an overall intraclass correlation (ICC) of reviewers’ ratings of ρ = .259 with a 95% confidence interval of [.249, .279]. In the humanities the ICCs were statistically significantly higher than in all other research areas except the technical sciences. The ICC in the biosciences deviated statistically significantly from the average ICC. Other factors (besides the research areas), such as the grant sum requested, had negligible influence on the ICC. Conclusions: Especially in the biosciences, the number of reviewers of each proposal should be increased so as to increase the ICC.
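The abstract's two quantitative ideas — the intraclass correlation of single-reviewer ratings and the conclusion that adding reviewers raises reliability — can be illustrated with a short sketch. Note the paper itself estimates the ICC via GEE; the ANOVA-based one-way ICC below and the Spearman–Brown formula are standard simpler stand-ins, and the function names and sample data are illustrative, not from the study.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1): proposals are rows, reviewer ratings are columns.
    This is the classical ANOVA estimator, not the GEE approach used in the paper."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    # between-proposal and within-proposal mean squares
    msb = k * ((row_means - grand) ** 2).sum() / (n - 1)
    msw = ((ratings - row_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

def spearman_brown(icc_single, k):
    """Reliability of the mean of k reviewers, given the single-reviewer ICC."""
    return k * icc_single / (1 + (k - 1) * icc_single)

# With the paper's overall single-reviewer ICC of .259, averaging over more
# reviewers raises the reliability of the mean rating:
for k in (1, 2, 3, 5):
    print(k, round(spearman_brown(0.259, k), 3))
```

This is why the conclusion targets reviewer counts: with a single-rater ICC of .259, the reliability of the mean rating climbs steadily as reviewers are added, so low-ICC fields such as the biosciences gain the most from extra reviews.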

Suggested Citation

  • Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2012. "Heterogeneity of Inter-Rater Reliabilities of Grant Peer Reviews and Its Determinants: A General Estimating Equations Approach," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-10, October.
  • Handle: RePEc:plo:pone00:0048509
    DOI: 10.1371/journal.pone.0048509

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0048509
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0048509&type=printable
    Download Restriction: no


    References listed on IDEAS

    1. Lutz Bornmann & Rüdiger Mutz & Hans-Dieter Daniel, 2010. "A Reliability-Generalization Study of Journal Peer Reviews: A Multilevel Meta-Analysis of Inter-Rater Reliability and Its Determinants," PLOS ONE, Public Library of Science, vol. 5(12), pages 1-10, December.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. David Gurwitz & Elena Milanesi & Thomas Koenig, 2014. "Grant Application Review: The Case of Transparency," PLOS Biology, Public Library of Science, vol. 12(12), pages 1-6, December.
    2. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    3. Lawson, Cornelia & Geuna, Aldo & Finardi, Ugo, 2021. "The funding-productivity-gender nexus in science, a multistage analysis," Research Policy, Elsevier, vol. 50(3).
    4. Bayindir, Esra Eren & Gurdal, Mehmet Yigit & Saglam, Ismail, 2019. "A Game Theoretic Approach to Peer Review of Grant Proposals," Journal of Informetrics, Elsevier, vol. 13(4).
    5. Rüdiger Mutz & Tobias Wolbring & Hans-Dieter Daniel, 2017. "The effect of the “very important paper” (VIP) designation in Angewandte Chemie International Edition on citation impact: A propensity score matching analysis," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(9), pages 2139-2153, September.
6. Patrícia Martinková & František Bartoš & Marek Brabec, 2023. "Assessing Inter-rater Reliability With Heterogeneous Variance Components Models: Flexible Approach Accounting for Contextual Variables," Journal of Educational and Behavioral Statistics, vol. 48(3), pages 349-383, June.
    7. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    8. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    9. Patrícia Martinková & Dan Goldhaber & Elena Erosheva, 2018. "Disparities in ratings of internal and external applicants: A case for model-based inter-rater reliability," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-17, October.
    10. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    11. Augusteijn, Hilde Elisabeth Maria & Wicherts, Jelte M. & Sijtsma, Klaas & van Assen, Marcel A. L. M., 2023. "Quality assessment of scientific manuscripts in peer review and education," OSF Preprints 7dc6a, Center for Open Science.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Grażyna Wieczorkowska & Katarzyna Kowalczyk, 2021. "Ensuring Sustainable Evaluation: How to Improve Quality of Evaluating Grant Proposals?," Sustainability, MDPI, vol. 13(5), pages 1-11, March.
    2. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    3. Steven Wooding & Thed N Van Leeuwen & Sarah Parks & Shitij Kapur & Jonathan Grant, 2015. "UK Doubles Its “World-Leading” Research in Life Sciences and Medicine in Six Years: Testing the Claim?," PLOS ONE, Public Library of Science, vol. 10(7), pages 1-10, July.
    4. Laura Muñoz-Bermejo & Jorge Pérez-Gómez & Fernando Manzano & Daniel Collado-Mateo & Santos Villafaina & José C Adsuar, 2019. "Reliability of isokinetic knee strength measurements in children: A systematic review and meta-analysis," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-15, December.
    5. David A. M. Peterson, 2020. "Dear Reviewer 2: Go F’ Yourself," Social Science Quarterly, Southwestern Social Science Association, vol. 101(4), pages 1648-1652, July.
    6. Vincent Chandler, 2019. "Identifying emerging scholars: seeing through the crystal ball of scholarship selection committees," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 39-56, July.
    7. Lei Li & Yan Wang & Guanfeng Liu & Meng Wang & Xindong Wu, 2015. "Context-Aware Reviewer Assignment for Trust Enhanced Peer Review," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-28, June.
    8. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    9. Pengfei Jia & Weixi Xie & Guangyao Zhang & Xianwen Wang, 2023. "Do reviewers get their deserved acknowledgments from the authors of manuscripts?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5687-5703, October.
    10. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    11. Bradford Demarest & Guo Freeman & Cassidy R. Sugimoto, 2014. "The reviewer in the mirror: examining gendered and ethnicized notions of reciprocity in peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 717-735, October.
    12. Peter Vaz da Fonseca & Andrea Decourt Savelli & Michele Nascimento Juca, 2020. "A Systematic Review of the Influence of Taxation on Corporate Capital Structure," International Journal of Economics & Business Administration (IJEBA), International Journal of Economics & Business Administration (IJEBA), vol. 0(2), pages 155-178.
    13. Hans van Dijk & Marino van Zelst, 2020. "Comfortably Numb? Researchers’ Satisfaction with the Publication System and a Proposal for Radical Change," Publications, MDPI, vol. 8(1), pages 1-20, March.
    14. Teplitskiy, Misha & Acuna, Daniel & Elamrani-Raoult, Aïda & Körding, Konrad & Evans, James, 2018. "The sociology of scientific validity: How professional networks shape judgement in peer review," Research Policy, Elsevier, vol. 47(9), pages 1825-1841.
    15. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    16. Augusteijn, Hilde Elisabeth Maria & Wicherts, Jelte M. & Sijtsma, Klaas & van Assen, Marcel A. L. M., 2023. "Quality assessment of scientific manuscripts in peer review and education," OSF Preprints 7dc6a, Center for Open Science.
    17. Rüdiger Mutz & Tobias Wolbring & Hans-Dieter Daniel, 2017. "The effect of the “very important paper” (VIP) designation in Angewandte Chemie International Edition on citation impact: A propensity score matching analysis," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(9), pages 2139-2153, September.
    18. Jue Ni & Zhenyue Zhao & Yupo Shao & Shuo Liu & Wanlin Li & Yaoze Zhuang & Junmo Qu & Yu Cao & Nayuan Lian & Jiang Li, 2021. "The influence of opening up peer review on the citations of journal articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9393-9404, December.
    19. John Jerrim, 2019. "Peer-review of grant proposals. An analysis of Economic and Social Research Council grant applications," DoQSS Working Papers 19-05, Quantitative Social Science - UCL Social Research Institute, University College London.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.