
Peer-review of grant proposals. An analysis of Economic and Social Research Council grant applications

Author

Listed:
  • John Jerrim

    (Department of Social Science, Institute of Education, University College London and Centre for Economic Performance, London School of Economics)

Abstract

Peer-review is widely used throughout academia, most notably in the publication of journal articles and the allocation of research grants. Yet it has been subject to much criticism, including being slow, unreliable, subjective and potentially prone to bias. This paper contributes to this literature by investigating the consistency of peer-reviews and the impact they have upon a high-stakes outcome (whether a research grant is funded). Analysing data from 4,000 social science grant proposals and 15,000 reviews, this paper illustrates how the scores assigned by different reviewers of the same proposal show only low levels of consistency (a correlation between reviewer scores of only 0.2). Reviews provided by ‘nominated reviewers’ (i.e. reviewers selected by the grant applicant) appear to be overly generous and do not correlate with the evaluations provided by independent reviewers. Yet a positive review from a nominated reviewer is strongly linked to whether a grant is awarded. Finally, a single negative peer-review is shown to reduce the chances of a proposal being funded from around 55% to around 25%, even when the proposal has otherwise been rated highly.
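The two headline calculations in the abstract (the reviewer-score correlation of around 0.2, and the drop in funding chances after a single negative review) can be approximated with a short script. The sketch below is illustrative only: the ESRC review data analysed in the paper are not publicly available, so the file names, column names (review_id, proposal_id, score, funded) and score thresholds are assumptions for demonstration, not the paper's actual analysis.

```python
# Illustrative sketch only. The ESRC data are not public; file names, column
# names and thresholds here are hypothetical and will not reproduce the paper.
import pandas as pd

reviews = pd.read_csv("reviews.csv")      # assumed columns: review_id, proposal_id, score
proposals = pd.read_csv("proposals.csv")  # assumed columns: proposal_id, funded (0/1)

# 1) Consistency between reviewers: correlate the scores given by every pair of
#    distinct reviewers to the same proposal (the paper reports r of about 0.2).
pairs = reviews.merge(reviews, on="proposal_id", suffixes=("_a", "_b"))
pairs = pairs[pairs["review_id_a"] < pairs["review_id_b"]]   # each pair once, no self-pairs
print("Pairwise reviewer-score correlation:",
      round(pairs["score_a"].corr(pairs["score_b"]), 2))

# 2) Impact of a single negative review on otherwise highly rated proposals:
#    compare funding rates with and without at least one low score.
summary = reviews.groupby("proposal_id")["score"].agg(["mean", "min"]).reset_index()
summary = summary.merge(proposals, on="proposal_id")
highly_rated = summary[summary["mean"] >= 5]          # "highly rated" cut-off is an assumption
negative = highly_rated["min"] <= 2                   # "negative review" cut-off is an assumption
print("Funded, no negative review:  ", round(highly_rated.loc[~negative, "funded"].mean(), 2))
print("Funded, >=1 negative review: ", round(highly_rated.loc[negative, "funded"].mean(), 2))
```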

Suggested Citation

  • John Jerrim, 2019. "Peer-review of grant proposals. An analysis of Economic and Social Research Council grant applications," DoQSS Working Papers 19-05, Quantitative Social Science - UCL Social Research Institute, University College London.
  • Handle: RePEc:qss:dqsswp:1905

    Download full text from publisher

    File URL: https://repec.ucl.ac.uk/REPEc/pdf/qsswp1905.pdf
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    2. Miriam L E Steiner Davis & Tiffani R Conner & Kate Miller-Bains & Leslie Shapard, 2020. "What makes an effective grants peer reviewer? An exploratory study of the necessary skills," PLOS ONE, Public Library of Science, vol. 15(5), pages 1-22, May.
    3. Stephen A Gallo & Afton S Carpenter & Scott R Glisson, 2013. "Teleconference versus Face-to-Face Scientific Peer Review of Grant Application: Effects on Review Outcomes," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-9, August.
    4. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    5. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Working Papers 26889, National Bureau of Economic Research, Inc.
    6. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
    7. Elias Bouacida & Renaud Foucart, 2022. "Rituals of Reason," Working Papers 344119591, Lancaster University Management School, Economics Department.
    8. Ginther, Donna K. & Heggeness, Misty L., 2020. "Administrative discretion in scientific funding: Evidence from a prestigious postdoctoral training program," Research Policy, Elsevier, vol. 49(4).
    9. Balázs Győrffy & Andrea Magda Nagy & Péter Herman & Ádám Török, 2018. "Factors influencing the scientific performance of Momentum grant holders: an evaluation of the first 117 research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 409-426, October.
    10. Grażyna Wieczorkowska & Katarzyna Kowalczyk, 2021. "Ensuring Sustainable Evaluation: How to Improve Quality of Evaluating Grant Proposals?," Sustainability, MDPI, vol. 13(5), pages 1-11, March.
    11. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    12. Steven Wooding & Thed N Van Leeuwen & Sarah Parks & Shitij Kapur & Jonathan Grant, 2015. "UK Doubles Its “World-Leading” Research in Life Sciences and Medicine in Six Years: Testing the Claim?," PLOS ONE, Public Library of Science, vol. 10(7), pages 1-10, July.
    13. Katarína Cechlárová & Tamás Fleiner & Eva Potpinková, 2014. "Assigning evaluators to research grant applications: the case of Slovak Research and Development Agency," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(2), pages 495-506, May.
    14. Paulina Kubera & Weronika Kwiatkowska, 2021. "Challenges Related to the Implementation of State Aid Measures for Entrepreneurs Affected by the Covid-19 Pandemic," European Research Studies Journal, European Research Studies Journal, vol. 0(Special 5), pages 209-220.
    15. Marjolijn N. Wijnen & Jorg J. M. Massen & Mariska E. Kret, 2021. "Gender bias in the allocation of student grants," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5477-5488, July.
    16. Laura Muñoz-Bermejo & Jorge Pérez-Gómez & Fernando Manzano & Daniel Collado-Mateo & Santos Villafaina & José C Adsuar, 2019. "Reliability of isokinetic knee strength measurements in children: A systematic review and meta-analysis," PLOS ONE, Public Library of Science, vol. 14(12), pages 1-15, December.
    17. van Dalen, Hendrik Peter, 2020. "How the Publish-or-Perish Principle Divides a Science: The Case of Academic Economists," Other publications TiSEM 6fbb6b92-0e06-4271-b6e7-3, Tilburg University, School of Economics and Management.
    18. Giovanni Abramo & Ciriaco D’Angelo, 2015. "An assessment of the first “scientific habilitation” for university appointments in Italy," Economia Politica: Journal of Analytical and Institutional Economics, Springer;Fondazione Edison, vol. 32(3), pages 329-357, December.
    19. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    20. David A. M. Peterson, 2020. "Dear Reviewer 2: Go F’ Yourself," Social Science Quarterly, Southwestern Social Science Association, vol. 101(4), pages 1648-1652, July.

    More about this item

    Keywords

    Peer-review; reliability; grants; scientific funding.



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:qss:dqsswp:1905. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Dr Neus Bover Fonts (email available below). General contact details of provider: https://edirc.repec.org/data/dqioeuk.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.