Printed from https://ideas.repec.org/a/nas/journl/v115y2018p2952-2957.html

Low agreement among reviewers evaluating the same NIH grant applications

Authors

  • Elizabeth L. Pier

    (Center for Women’s Health Research, University of Wisconsin–Madison, Madison, WI 53715; Department of Educational Psychology, University of Wisconsin–Madison, Madison, WI 53706)

  • Markus Brauer

    (Department of Psychology, University of Wisconsin–Madison, Madison, WI 53706)

  • Amarette Filut

    (Center for Women’s Health Research, University of Wisconsin–Madison, Madison, WI 53715)

  • Anna Kaatz

    (Center for Women’s Health Research, University of Wisconsin–Madison, Madison, WI 53715)

  • Joshua Raclaw

    (Center for Women’s Health Research, University of Wisconsin–Madison, Madison, WI 53715; Department of English, West Chester University, West Chester, PA 19383)

  • Mitchell J. Nathan

    (Department of Educational Psychology, University of Wisconsin–Madison, Madison, WI 53706)

  • Cecilia E. Ford

    (Center for Women’s Health Research, University of Wisconsin–Madison, Madison, WI 53715; Department of English, University of Wisconsin–Madison, Madison, WI 53706; Department of Sociology, University of Wisconsin–Madison, Madison, WI 53706)

  • Molly Carnes

    (Center for Women’s Health Research, University of Wisconsin–Madison, Madison, WI 53715; Department of Medicine, University of Wisconsin–Madison, Madison, WI 53792)

Abstract

Obtaining grant funding from the National Institutes of Health (NIH) is increasingly competitive, as funding success rates have declined over the past decade. To allocate relatively scarce funds, scientific peer reviewers must differentiate the very best applications from comparatively weaker ones. Despite the importance of this determination, little research has explored how reviewers assign ratings to the applications they review and whether there is consistency in the reviewers’ evaluation of the same application. Replicating all aspects of the NIH peer-review process, we examined 43 individual reviewers’ ratings and written critiques of the same group of 25 NIH grant applications. Results showed no agreement among reviewers regarding the quality of the applications in either their qualitative or quantitative evaluations. Although all reviewers received the same instructions on how to rate applications and format their written critiques, we also found no agreement in how reviewers “translated” a given number of strengths and weaknesses into a numeric rating. It appeared that the outcome of the grant review depended more on the reviewer to whom the grant was assigned than on the research proposed in the grant. This research replicates the NIH peer-review process to examine in detail the qualitative and quantitative judgments of different reviewers examining the same application, and our results have broad relevance for scientific grant peer review.
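The abstract's central quantity, agreement among reviewers scoring the same applications, is commonly summarized with an intraclass correlation coefficient (ICC). The following sketch is purely illustrative and is not the paper's code: the ratings matrix is invented, and a one-way ICC(1,1) is computed by hand for a small table of applications scored on the NIH 1–9 scale. Values near zero or below indicate no agreement of the kind the study reports.

```python
# Illustrative sketch (hypothetical data, not the study's): one-way
# intraclass correlation ICC(1,1), a standard inter-rater agreement index.
# Rows = applications, columns = reviewer ratings (NIH scale, 1 = best).

def icc_oneway(ratings):
    """ICC(1,1) for n targets each rated by k raters (one-way random model)."""
    n = len(ratings)            # number of applications
    k = len(ratings[0])         # ratings per application
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    # Between-applications mean square
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    # Within-application (between-reviewer) mean square
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

ratings = [
    [2, 5, 3],
    [4, 2, 6],
    [3, 6, 2],
    [5, 3, 4],
]
print(round(icc_oneway(ratings), 3))  # → -0.425 (no agreement)
```

A near-zero or negative ICC, as in this made-up table, means reviewer identity explains the score variance better than application identity does, which is the pattern the abstract describes.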

Suggested Citation

  • Elizabeth L. Pier & Markus Brauer & Amarette Filut & Anna Kaatz & Joshua Raclaw & Mitchell J. Nathan & Cecilia E. Ford & Molly Carnes, 2018. "Low agreement among reviewers evaluating the same NIH grant applications," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 115(12), pages 2952-2957, March.
  • Handle: RePEc:nas:journl:v:115:y:2018:p:2952-2957

    Download full text from publisher

    File URL: http://www.pnas.org/content/115/12/2952.full
    Download Restriction: no

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Elias Bouacida & Renaud Foucart, 2020. "The acceptability of lotteries in allocation problems," Working Papers 301646245, Lancaster University Management School, Economics Department.
    2. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
    3. van Dalen, Hendrik Peter, 2020. "How the Publish-or-Perish Principle Divides a Science : The Case of Academic Economists," Discussion Paper 2020-020, Tilburg University, Center for Economic Research.
    4. Paulina Kubera & Weronika Kwiatkowska, 2021. "Challenges Related to the Implementation of State Aid Measures for Entrepreneurs Affected by the Covid-19 Pandemic," European Research Studies Journal, European Research Studies Journal, vol. 0(Special 5), pages 209-220.
    5. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    6. Cruz-Castro, Laura & Sanz-Menendez, Luis, 2021. "What should be rewarded? Gender and evaluation criteria for tenure and promotion," Journal of Informetrics, Elsevier, vol. 15(3).
    7. Miriam L E Steiner Davis & Tiffani R Conner & Kate Miller-Bains & Leslie Shapard, 2020. "What makes an effective grants peer reviewer? An exploratory study of the necessary skills," PLOS ONE, Public Library of Science, vol. 15(5), pages 1-22, May.
    8. Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
    9. Ginther, Donna K. & Heggeness, Misty L., 2020. "Administrative discretion in scientific funding: Evidence from a prestigious postdoctoral training program," Research Policy, Elsevier, vol. 49(4).
    10. José Luis Ortega, 2022. "Classification and analysis of PubPeer comments: How a web journal club is used," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(5), pages 655-670, May.
    11. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Chapters, in: Innovation and Public Policy, pages 117-150, National Bureau of Economic Research, Inc.
    12. Rovetta, Alessandro & Castaldo, Lucia, 2022. "Are We Sure We Fully Understand What an Infodemic Is? A Global Perspective on Infodemiological Problems," SocArXiv xw723, Center for Open Science.
    13. Donna K. Ginther & Misty L. Heggeness, 2020. "Administrative Discretion in Scientific Funding: Evidence from a Prestigious Postdoctoral Training Program," NBER Working Papers 26841, National Bureau of Economic Research, Inc.
    14. Gregoire Mariethoz & Frédéric Herman & Amelie Dreiss, 2021. "Reply to the comment by Heyard et al. titled “Imaginary carrot or effective fertiliser? A rejoinder on funding and productivity”," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(11), pages 9339-9342, November.
    15. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    16. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    17. Sven E. Hug & Mirjam Aeschbach, 2020. "Criteria for assessing grant applications: a systematic review," Palgrave Communications, Palgrave Macmillan, vol. 6(1), pages 1-15, December.
    18. John Jerrim, 2019. "Peer-review of grant proposals. An analysis of Economic and Social Research Council grant applications," DoQSS Working Papers 19-05, Quantitative Social Science - UCL Social Research Institute, University College London.
    19. Elias Bouacida & Renaud Foucart, 2022. "Rituals of Reason," Working Papers 344119591, Lancaster University Management School, Economics Department.
    20. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Working Papers 26889, National Bureau of Economic Research, Inc.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nas:journl:v:115:y:2018:p:2952-2957. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Eric Cain (email available below). General contact details of provider: http://www.pnas.org/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.