
Criteria for assessing grant applications: a systematic review

Author

Listed:
  • Sven E. Hug

    (University of Zurich)

  • Mirjam Aeschbach

    (University of Zurich)

Abstract

Criteria are an essential component of any procedure for assessing merit. Yet, little is known about the criteria peers use to assess grant applications. In this systematic review, we therefore identify and synthesize studies that examine grant peer review criteria in an empirical and inductive manner. To facilitate the synthesis, we introduce a framework that classifies what is generally referred to as a ‘criterion’ into an evaluated entity (i.e., the object of evaluation) and an evaluation criterion (i.e., the dimension along which an entity is evaluated). In total, the synthesis includes 12 studies on grant peer review criteria. Two-thirds of these studies examine criteria in the medical and health sciences, while studies in other fields are scarce. Few studies compare criteria across different fields, and none focus on criteria for interdisciplinary research. We conducted a qualitative content analysis of the 12 studies and thereby identified 15 evaluation criteria and 30 evaluated entities, as well as the relations between them. Based on a network analysis, we determined the following main relations between the identified evaluation criteria and evaluated entities. The aims and outcomes of a proposed project are assessed in terms of the evaluation criteria originality, academic relevance, and extra-academic relevance. The proposed research process is evaluated both on the content level (quality, appropriateness, rigor, coherence/justification) and on the level of description (clarity, completeness). The resources needed to implement the research process are evaluated in terms of the evaluation criterion feasibility. Lastly, the person and personality of the applicant are assessed from a ‘psychological’ (motivation, traits) and a ‘sociological’ (diversity) perspective. Furthermore, we find that some of the criteria peers use to evaluate grant applications do not conform to the fairness doctrine and the ideal of impartiality. Grant peer review could therefore be considered unfair and biased. Our findings suggest that future studies on criteria in grant peer review should focus on the applicant, include data from non-Western countries, and examine fields other than the medical and health sciences.
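The entity-criterion relations summarized in the abstract form a small bipartite network, which is the structure the authors analyze. The sketch below is not the authors' code: the use of Python and networkx, and all identifiers, are illustrative assumptions. It encodes only the relations named in the abstract and ranks nodes by degree, a simple proxy for how connected a criterion or entity is:

```python
# Minimal sketch (illustrative, not the authors' method) of the
# entity-criterion network described in the abstract. Node and edge
# labels come from the abstract; the bipartite-graph representation
# and the networkx library are assumptions made for illustration.
import networkx as nx

relations = {
    "aims and outcomes": ["originality", "academic relevance", "extra-academic relevance"],
    "research process (content)": ["quality", "appropriateness", "rigor", "coherence/justification"],
    "research process (description)": ["clarity", "completeness"],
    "resources": ["feasibility"],
    "applicant (psychological)": ["motivation", "traits"],
    "applicant (sociological)": ["diversity"],
}

G = nx.Graph()
for entity, criteria in relations.items():
    G.add_node(entity, kind="evaluated entity")
    for criterion in criteria:
        G.add_node(criterion, kind="evaluation criterion")
        G.add_edge(entity, criterion)

# Degree hints at which nodes anchor the network: entities linked to
# several criteria come out on top in this toy reconstruction.
for node, deg in sorted(G.degree, key=lambda x: -x[1]):
    print(f"{deg}  {node} ({G.nodes[node]['kind']})")
```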

Suggested Citation

  • Sven E. Hug & Mirjam Aeschbach, 2020. "Criteria for assessing grant applications: a systematic review," Palgrave Communications, Palgrave Macmillan, vol. 6(1), pages 1-15, December.
  • Handle: RePEc:pal:palcom:v:6:y:2020:i:1:d:10.1057_s41599-020-0412-9
    DOI: 10.1057/s41599-020-0412-9

    Download full text from publisher

    File URL: http://link.springer.com/10.1057/s41599-020-0412-9
    File Function: Abstract
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1057/s41599-020-0412-9?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source you can access with your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hendy Abdoul & Christophe Perrey & Philippe Amiel & Florence Tubach & Serge Gottot & Isabelle Durand-Zaleski & Corinne Alberti, 2012. "Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices," PLOS ONE, Public Library of Science, vol. 7(9), pages 1-15, September.
    2. Pleun van Arensbergen & Inge van der Weijden & Peter van den Besselaar, 2014. "Different views on scholarly talent: What are the talents we are looking for in science?," Research Evaluation, Oxford University Press, vol. 23(4), pages 273-284.
    3. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, vol. 9(1), pages 21582440198, February.
    4. Stevan Harnad, 2009. "Open access scientometrics and the UK Research Assessment Exercise," Scientometrics, Springer;Akadémiai Kiadó, vol. 79(1), pages 147-156, April.
    5. Martin Reinhart, 2010. "Peer review practices: a content analysis of external reviews in science funding," Research Evaluation, Oxford University Press, vol. 19(5), pages 317-331, December.
    6. Nees Jan van Eck & Ludo Waltman, 2010. "Software survey: VOSviewer, a computer program for bibliometric mapping," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(2), pages 523-538, August.
    7. Sven E. Hug & Michael Ochsner & Hans-Dieter Daniel, 2013. "Criteria for assessing research quality in the humanities: a Delphi study among scholars of English literature, German literature and art history," Research Evaluation, Oxford University Press, vol. 22(5), pages 369-383, August.
    8. Elizabeth L. Pier & Markus Brauer & Amarette Filut & Anna Kaatz & Joshua Raclaw & Mitchell J. Nathan & Cecilia E. Ford & Molly Carnes, 2018. "Low agreement among reviewers evaluating the same NIH grant applications," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 115(12), pages 2952-2957, March.
    9. Oortwijn, Wija J. & Vondeling, Hindrik & van Barneveld, Teus & van Vugt, Christel & Bouter, Lex M., 2002. "Priority setting for health technology assessment in The Netherlands: principles and practice," Health Policy, Elsevier, vol. 62(3), pages 227-242, December.
    10. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    11. Mårtensson, Pär & Fors, Uno & Wallin, Sven-Bertil & Zander, Udo & Nilsson, Gunnar H, 2016. "Evaluating research: A multidisciplinary approach to assessing research practice and quality," Research Policy, Elsevier, vol. 45(3), pages 593-603.
    12. Peter van den Besselaar & Ulf Sandström & Hélène Schiffbaenker, 2018. "Studying grant decision-making: a linguistic analysis of review reports," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 313-329, October.
    13. Jochen Gläser & Grit Laudel, 2005. "Advantages and dangers of ‘remote’ peer evaluation," Research Evaluation, Oxford University Press, vol. 14(3), pages 186-198, December.
    14. Lipworth, Wendy L. & Kerridge, Ian H. & Carter, Stacy M. & Little, Miles, 2011. "Journal peer review in context: A qualitative study of the social and subjective dimensions of manuscript review in biomedical publishing," Social Science & Medicine, Elsevier, vol. 72(7), pages 1056-1063, April.
    15. Christopher W. Belter, 2016. "Citation analysis as a literature search method for systematic reviews," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(11), pages 2766-2777, November.
    16. OECD, 2018. "Effective operation of competitive research funding systems," OECD Science, Technology and Industry Policy Papers 57, OECD Publishing.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Song Jing & Qingzhao Ma & Siyi Wang & Hanliang Xu & Tian Xu & Xia Guo & Zhuolin Wu, 2024. "Research on developmental evaluation based on the "four abilities" model: evidence from early career researchers in China," Quality & Quantity: International Journal of Methodology, Springer, vol. 58(1), pages 681-704, February.
    2. Emre Özel, 2024. "What is Gender Bias in Grant Peer review?," Working Papers halshs-03862027, HAL.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    2. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, vol. 9(1), pages 21582440198, February.
    3. Seeber, Marco & Alon, Ilan & Pina, David G. & Piro, Fredrik Niclas & Seeber, Michele, 2022. "Predictors of applying for and winning an ERC Proof-of-Concept grant: An automated machine learning model," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    4. Richard R Snell, 2015. "Menage a Quoi? Optimal Number of Peer Reviewers," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-14, April.
    5. Cruz-Castro, Laura & Sanz-Menendez, Luis, 2021. "What should be rewarded? Gender and evaluation criteria for tenure and promotion," Journal of Informetrics, Elsevier, vol. 15(3).
    6. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    7. Emre Özel, 2024. "What is Gender Bias in Grant Peer review?," Working Papers halshs-03862027, HAL.
    8. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    9. A. I. M. Jakaria Rahman & Raf Guns & Loet Leydesdorff & Tim C. E. Engels, 2016. "Measuring the match between evaluators and evaluees: cognitive distances between panel members and research groups at the journal level," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1639-1663, December.
    10. Lawrence Smolinsky & Daniel S. Sage & Aaron J. Lercher & Aaron Cao, 2021. "Citations versus expert opinions: citation analysis of featured reviews of the American Mathematical Society," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 3853-3870, May.
    11. Rebecca Abma-Schouten & Joey Gijbels & Wendy Reijmerink & Ingeborg Meijer, 2023. "Evaluation of research proposals by peer review panels: broader panels for broader assessments?," Science and Public Policy, Oxford University Press, vol. 50(4), pages 619-632.
    12. Zhichao Wang & Valentin Zelenyuk, 2021. "Performance Analysis of Hospitals in Australia and its Peers: A Systematic Review," CEPA Working Papers Series WP012021, School of Economics, University of Queensland, Australia.
    13. Jürgen Janger & Nicole Schmidt & Anna Strauss, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664, April.
    14. Meyer, Matthias & Waldkirch, Rüdiger W. & Duscher, Irina & Just, Alexander, 2018. "Drivers of citations: An analysis of publications in “top” accounting journals," CRITICAL PERSPECTIVES ON ACCOUNTING, Elsevier, vol. 51(C), pages 24-46.
    15. Andrada Elena Urda-Cîmpean & Sorana D. Bolboacă & Andrei Achimaş-Cadariu & Tudor Cătălin Drugan, 2016. "Knowledge Production in Two Types of Medical PhD Routes—What’s to Gain?," Publications, MDPI, vol. 4(2), pages 1-16, June.
    16. Randa Alsabahi, 2022. "English Medium Publications: Opening or Closing Doors to Authors with Non-English Language Backgrounds," English Language Teaching, Canadian Center of Science and Education, vol. 15(10), pages 1-18, October.
    17. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    18. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    19. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    20. David Card & Stefano DellaVigna, 2020. "What Do Editors Maximize? Evidence from Four Economics Journals," The Review of Economics and Statistics, MIT Press, vol. 102(1), pages 195-217, March.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:pal:palcom:v:6:y:2020:i:1:d:10.1057_s41599-020-0412-9. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: https://www.nature.com/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.