
Assessing health research grant applications: A retrospective comparative review of a one-stage versus a two-stage application assessment process

Authors

Listed:
  • Ben Morgan
  • Ly-Mee Yu
  • Tom Solomon
  • Sue Ziebland

Abstract

Background: Research funders use a wide variety of application assessment processes, yet there is little evidence on their relative advantages and disadvantages. A broad distinction can be made between processes with a single-stage assessment of full proposals and those that first invite an outline, with full proposals invited at a second stage only from applicants who are shortlisted. This paper examines the effects of changing from a one-stage to a two-stage process within the UK National Institute for Health Research's (NIHR) Research for Patient Benefit (RfPB) Programme, which made this change in 2015.

Methods: A retrospective comparative design was used to compare eight one-stage funding competitions (912 applications) with eight two-stage funding competitions (1090 applications). Comparisons were made between the number of applications submitted, the number of peer and lay reviews required, the duration of the funding round, average external peer review scores, and the total costs involved.

Results: There was a mean of 114 applications per funding round for the one-stage process and 136 for the two-stage process. The one-stage process took a mean of 274 days to complete and the two-stage process 348 days, although under the two-stage process unsuccessful applicants (i.e. the majority) were informed at a mean of 195 days (79 days earlier). The mean peer review score for full applications was 6.46 under the one-stage process and 6.82 under the two-stage process (a 5.6% difference on a 1–10 scale, with 10 being the highest), but there was no significant difference between the lay reviewer scores. The one-stage process required a mean of 423 peer reviews and 102 lay reviews per funding round, and the two-stage process a mean of 208 peer reviews and 50 lay reviews (a mean difference of 215 peer reviews and 52 lay reviews). The overall cost per funding round fell from £148,908 for the one-stage process to £105,342 for the two-stage process, a saving of £43,566 per round.

Conclusion: We conclude that a two-stage application process increases the number of applications submitted to a funding round, is less burdensome and more efficient for all those involved, is cost-effective, and is associated with a small increase in peer review scores. For the addition of fewer than 11 weeks to the overall process, substantial efficiencies are gained that benefit funders, applicants and science. Funding agencies should consider adopting a two-stage application assessment process.
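The arithmetic behind the differences reported in the Results section can be checked directly from the figures quoted in the abstract. The sketch below is illustrative only; the two dictionaries simply restate the abstract's numbers, and the variable names are ours, not the authors':

```python
# Figures quoted in the abstract, per funding round (means).
one_stage = {"days": 274, "score": 6.46, "peer": 423, "lay": 102, "cost": 148908}
two_stage = {"days": 348, "score": 6.82, "peer": 208, "lay": 50, "cost": 105342}

# Extra duration of the two-stage process: 74 days, i.e. fewer than 11 weeks.
extra_days = two_stage["days"] - one_stage["days"]

# Relative difference in mean peer review score: ~5.6% on the 1-10 scale.
score_diff_pct = (two_stage["score"] - one_stage["score"]) / one_stage["score"] * 100

# Reviews and cost saved per round: 215 peer reviews, 52 lay reviews, £43,566.
peer_saved = one_stage["peer"] - two_stage["peer"]
lay_saved = one_stage["lay"] - two_stage["lay"]
cost_saved = one_stage["cost"] - two_stage["cost"]

print(extra_days, round(score_diff_pct, 1), peer_saved, lay_saved, cost_saved)
```

Each derived quantity matches the corresponding figure stated in the abstract.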

Suggested Citation

  • Ben Morgan & Ly-Mee Yu & Tom Solomon & Sue Ziebland, 2020. "Assessing health research grant applications: A retrospective comparative review of a one-stage versus a two-stage application assessment process," PLOS ONE, Public Library of Science, vol. 15(3), pages 1-18, March.
  • Handle: RePEc:plo:pone00:0230118
    DOI: 10.1371/journal.pone.0230118

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0230118
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0230118&type=printable
    Download Restriction: no



    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eliseo Reategui & Alause Pires & Michel Carniato & Sergio Roberto Kieling Franco, 2020. "Evaluation of Brazilian research output in education: confronting international and national contexts," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 427-444, October.
    2. Ginther, Donna K. & Heggeness, Misty L., 2020. "Administrative discretion in scientific funding: Evidence from a prestigious postdoctoral training program," Research Policy, Elsevier, vol. 49(4).
    3. Győrffy, Balázs & Herman, Péter & Szabó, István, 2020. "Research funding: past performance is a stronger predictor of future scientific output than reviewer scores," Journal of Informetrics, Elsevier, vol. 14(3).
    4. Stephen Gallo & Lisa Thompson & Karen Schmaling & Scott Glisson, 2018. "Risk evaluation in peer review of grant applications," Environment Systems and Decisions, Springer, vol. 38(2), pages 216-229, June.
    5. Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Peer reviewers’ dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-11, December.
    6. Jacqueline N. Lane & Misha Teplitskiy & Gary Gray & Hardeep Ranu & Michael Menietti & Eva C. Guinan & Karim R. Lakhani, 2022. "Conservatism Gets Funded? A Field Experiment on the Role of Negative Information in Novel Project Evaluation," Management Science, INFORMS, vol. 68(6), pages 4478-4495, June.
    7. Donna K. Ginther & Misty L. Heggeness, 2020. "Administrative Discretion in Scientific Funding: Evidence from a Prestigious Postdoctoral Training Program," NBER Working Papers 26841, National Bureau of Economic Research, Inc.
    8. Joshua Krieger & Ramana Nanda & Ian Hunt & Aimee Reynolds & Peter Tarsa, 2022. "Scoring and Funding Breakthrough Ideas: Evidence from a Global Pharmaceutical Company," Harvard Business School Working Papers 23-014, Harvard Business School, revised Nov 2023.
