Printed from https://ideas.repec.org/a/plo/pone00/0232327.html

What makes an effective grants peer reviewer? An exploratory study of the necessary skills

Author

Listed:
  • Miriam L E Steiner Davis
  • Tiffani R Conner
  • Kate Miller-Bains
  • Leslie Shapard

Abstract

This exploratory mixed methods study describes the skills required to be an effective peer reviewer on review panels conducted for federal agencies that fund research, and examines how reviewer experience and the use of technology within such panels affect reviewer skill development. Two specific review panel formats are considered: in-person face-to-face and virtual video conference. Data were collected through interviews with seven program officers and five expert peer review panelists, and surveys from 51 respondents. Results include the skills reviewers consider necessary for effective review panel participation, their assessment of the relative importance of these skills, how they are learned, and how review format affects skill development and improvement. Results are discussed relative to the peer review literature and with consideration of the professional skills needed by successful scientists and peer reviewers.

Suggested Citation

  • Miriam L E Steiner Davis & Tiffani R Conner & Kate Miller-Bains & Leslie Shapard, 2020. "What makes an effective grants peer reviewer? An exploratory study of the necessary skills," PLOS ONE, Public Library of Science, vol. 15(5), pages 1-22, May.
  • Handle: RePEc:plo:pone00:0232327
    DOI: 10.1371/journal.pone.0232327

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0232327
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0232327&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0232327?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Michael Obrecht & Karl Tibelius & Guy D'Aloisio, 2007. "Examining the value added by committee discussion in the review of applications for research awards," Research Evaluation, Oxford University Press, vol. 16(2), pages 79-91, June.
    2. Michael R Martin & Andrea Kopstein & Joy M Janice, 2010. "An Analysis of Preliminary and Post-Discussion Priority Scores for Grant Applications Peer Reviewed by the Center for Scientific Review at the NIH," PLOS ONE, Public Library of Science, vol. 5(11), pages 1-6, November.
    3. Thijs Bol & Mathijs de Vaan & Arnout van de Rijt, 2018. "The Matthew effect in science funding," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 115(19), pages 4887-4890, May.
    4. Elizabeth L. Pier & Markus Brauer & Amarette Filut & Anna Kaatz & Joshua Raclaw & Mitchell J. Nathan & Cecilia E. Ford & Molly Carnes, 2018. "Low agreement among reviewers evaluating the same NIH grant applications," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 115(12), pages 2952-2957, March.
    5. Stephen A Gallo & Afton S Carpenter & Scott R Glisson, 2013. "Teleconference versus Face-to-Face Scientific Peer Review of Grant Application: Effects on Review Outcomes," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-9, August.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    2. Stephen A Gallo & Afton S Carpenter & Scott R Glisson, 2013. "Teleconference versus Face-to-Face Scientific Peer Review of Grant Application: Effects on Review Outcomes," PLOS ONE, Public Library of Science, vol. 8(8), pages 1-9, August.
    3. Ginther, Donna K. & Heggeness, Misty L., 2020. "Administrative discretion in scientific funding: Evidence from a prestigious postdoctoral training program," Research Policy, Elsevier, vol. 49(4).
    4. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    5. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    6. John Jerrim, 2019. "Peer-review of grant proposals. An analysis of Economic and Social Research Council grant applications," DoQSS Working Papers 19-05, Quantitative Social Science - UCL Social Research Institute, University College London.
    7. Lu, Wei & Ren, Yan & Huang, Yong & Bu, Yi & Zhang, Yuehan, 2021. "Scientific collaboration and career stages: An ego-centric perspective," Journal of Informetrics, Elsevier, vol. 15(4).
    8. Belén Álvarez-Bornstein & Adrián A. Díaz-Faes & María Bordons, 2019. "What characterises funded biomedical research? Evidence from a basic and a clinical domain," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 805-825, May.
    9. Corsini, Alberto & Pezzoni, Michele, 2023. "Does grant funding foster research impact? Evidence from France," Journal of Informetrics, Elsevier, vol. 17(4).
    10. Feichtinger, Gustav & Grass, Dieter & Kort, Peter M. & Seidl, Andrea, 2021. "On the Matthew effect in research careers," Journal of Economic Dynamics and Control, Elsevier, vol. 123(C).
    11. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Working Papers 26889, National Bureau of Economic Research, Inc.
    12. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
    13. Matteo Cinelli & Giovanna Ferraro & Antonio Iovanella, 2022. "Connections matter: a proxy measure for evaluating network membership with an application to the Seventh Research Framework Programme," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(7), pages 3959-3976, July.
    14. Elias Bouacida & Renaud Foucart, 2022. "Rituals of Reason," Working Papers 344119591, Lancaster University Management School, Economics Department.
    15. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    16. Seeber, Marco & Alon, Ilan & Pina, David G. & Piro, Fredrik Niclas & Seeber, Michele, 2022. "Predictors of applying for and winning an ERC Proof-of-Concept grant: An automated machine learning model," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    17. Lawson, Cornelia & Salter, Ammon, 2023. "Exploring the effect of overlapping institutional applications on panel decision-making," Research Policy, Elsevier, vol. 52(9).
    18. Alessandro Pluchino & Alessio Emanuele Biondo & Andrea Rapisarda, 2018. "Talent Versus Luck: The Role Of Randomness In Success And Failure," Advances in Complex Systems (ACS), World Scientific Publishing Co. Pte. Ltd., vol. 21(03n04), pages 1-31, May.
    19. Conor O’Kane & Jing A. Zhang & Jarrod Haar & James A. Cunningham, 2023. "How scientists interpret and address funding criteria: value creation and undesirable side effects," Small Business Economics, Springer, vol. 61(2), pages 799-826, August.
    20. Liao, Chien Hsiang, 2021. "The Matthew effect and the halo effect in research funding," Journal of Informetrics, Elsevier, vol. 15(1).

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0232327. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.