Printed from https://ideas.repec.org/a/spr/scient/v129y2024i3d10.1007_s11192-023-04915-y.html

How do referees integrate evaluation criteria into their overall judgment? Evidence from grant peer review

Author

Listed:
  • Sven E. Hug

    (University of Zurich)

Abstract

Little is known about whether peer reviewers use the same evaluation criteria or how they integrate those criteria into their overall judgment. This study therefore proposed two assessment styles based on theoretical perspectives and normative positions. According to the case-by-case style, referees use many different criteria, weight them on a case-by-case basis, and integrate them into their overall judgment in a complex, non-mechanical way. According to the uniform style, referees use a small fraction of the available criteria, apply the same criteria, weight them in the same way, and integrate them using simple rules (i.e., fast-and-frugal heuristics). These two styles were examined using a unique dataset from a career funding scheme that contained a comparatively large number of evaluation criteria. A heuristic procedure (fast-and-frugal trees) and a complex procedure (logistic regression) were employed to describe how referees integrate the criteria into their overall judgment. The logistic regression predicted the referees' overall assessment with high accuracy, and slightly more accurately than the fast-and-frugal trees. Overall, the results support the uniform style but also indicate that it needs to be revised as follows: referees use many criteria and integrate them using complex rules. Most importantly, however, the revised style could describe most, but not all, of the referees' judgments. Future studies should therefore examine how referees' judgments can be characterized in the cases where the uniform style failed, and the evaluation process of referees should be studied in more empirical and theoretical detail.
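To make the contrast between the two integration procedures concrete, the sketch below shows what a fast-and-frugal tree looks like as a decision rule: cues are checked one at a time, and some cue values trigger an immediate exit (a final decision) without consulting the remaining criteria. This is a minimal illustration only; the criteria names, the 1-6 rating scale, and the thresholds are hypothetical and are not taken from the paper's data.

```python
# Minimal sketch of a fast-and-frugal tree (FFT) for a binary funding
# recommendation. All criteria names and cut-offs are hypothetical,
# assuming ratings on a 1-6 scale (higher = better).

def fft_judge(application):
    """Check cues sequentially; some cue values exit with a decision immediately."""
    # Cue 1: a weak track record leads straight to rejection.
    if application["track_record"] < 4:
        return "reject"
    # Cue 2: an outstanding proposal leads straight to acceptance.
    if application["proposal_quality"] >= 5:
        return "accept"
    # Final cue: feasibility decides all remaining cases.
    return "accept" if application["feasibility"] >= 4 else "reject"

applications = [
    {"track_record": 5, "proposal_quality": 6, "feasibility": 3},  # exits at cue 2
    {"track_record": 2, "proposal_quality": 6, "feasibility": 6},  # exits at cue 1
    {"track_record": 4, "proposal_quality": 4, "feasibility": 5},  # decided by final cue
]
print([fft_judge(a) for a in applications])  # -> ['accept', 'reject', 'accept']
```

A logistic regression, by contrast, would weight and sum all criteria for every application before thresholding, which is the "complex rule" integration the study found to fit the referees' judgments slightly better.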

Suggested Citation

  • Sven E. Hug, 2024. "How do referees integrate evaluation criteria into their overall judgment? Evidence from grant peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(3), pages 1231-1253, March.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:3:d:10.1007_s11192-023-04915-y
    DOI: 10.1007/s11192-023-04915-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-023-04915-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-023-04915-y?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. David Aikman & Mirta Galesic & Gerd Gigerenzer & Sujit Kapadia & Konstantinos Katsikopoulos & Amit Kothiyal & Emma Murphy & Tobias Neumann, 2021. "Taking uncertainty seriously: simplicity versus complexity in financial regulation [Uncertainty in macroeconomic policy-making: art or science?]," Industrial and Corporate Change, Oxford University Press and the Associazione ICC, vol. 30(2), pages 317-345.
    2. Lutz Bornmann, 2015. "Complex tasks and simple solutions: The use of heuristics in the evaluation of research," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(8), pages 1738-1739, August.
    3. Vladimir Batagelj & Anuška Ferligoj & Flaminio Squazzoni, 2017. "The emergence of a field: a network analysis of research on peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 503-532, October.
    4. Hren, Darko & Pina, David G. & Norman, Christopher R. & Marušić, Ana, 2022. "What makes or breaks competitive research proposals? A mixed-methods analysis of research grant evaluation reports," Journal of Informetrics, Elsevier, vol. 16(2).
    5. Phillips, Nathaniel D. & Neth, Hansjörg & Woike, Jan K. & Gaissmaier, Wolfgang, 2017. "FFTrees: A toolbox to create, visualize, and evaluate fast-and-frugal decision trees," Judgment and Decision Making, Cambridge University Press, vol. 12(4), pages 344-368, July.
    6. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    7. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    8. Alessandro Margherita & Gianluca Elia & Claudio Petti, 2022. "What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives," Sustainability, MDPI, vol. 14(5), pages 1-18, March.
    9. Matthew K Eblen & Robin M Wagner & Deepshikha RoyChowdhury & Katherine C Patel & Katrina Pearson, 2016. "How Criterion Scores Predict the Overall Impact Score and Funding Outcomes for National Institutes of Health Peer-Reviewed Applications," PLOS ONE, Public Library of Science, vol. 11(6), pages 1-17, June.
    10. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    11. Sven E. Hug & Michael Ochsner & Hans-Dieter Daniel, 2013. "Criteria for assessing research quality in the humanities: a Delphi study among scholars of English literature, German literature and art history," Research Evaluation, Oxford University Press, vol. 22(5), pages 369-383, August.
    12. Francisco Grimaldo & Ana Marušić & Flaminio Squazzoni, 2018. "Fragments of peer review: A quantitative analysis of the literature (1969-2015)," PLOS ONE, Public Library of Science, vol. 13(2), pages 1-14, February.
    13. Ron Johnston & Kelvyn Jones & David Manley, 2018. "Confounding and collinearity in regression analysis: a cautionary tale and an alternative procedure, illustrated by studies of British voting behaviour," Quality & Quantity: International Journal of Methodology, Springer, vol. 52(4), pages 1957-1976, July.
    14. Sven E Hug & Michael Ochsner, 2022. "Do peers share the same criteria for assessing grant applications?," Research Evaluation, Oxford University Press, vol. 31(1), pages 104-117.
    15. Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Peer reviewers' dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-11, December.
    16. Mårtensson, Pär & Fors, Uno & Wallin, Sven-Bertil & Zander, Udo & Nilsson, Gunnar H, 2016. "Evaluating research: A multidisciplinary approach to assessing research practice and quality," Research Policy, Elsevier, vol. 45(3), pages 593-603.
    17. Martin Reinhart, 2009. "Peer review of grant applications in biology and medicine. Reliability, fairness, and validity," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 789-809, December.
    18. Cruz-Castro, Laura & Sanz-Menendez, Luis, 2021. "What should be rewarded? Gender and evaluation criteria for tenure and promotion," Journal of Informetrics, Elsevier, vol. 15(3).
    19. Sven E. Hug & Mirjam Aeschbach, 2020. "Criteria for assessing grant applications: a systematic review," Palgrave Communications, Palgrave Macmillan, vol. 6(1), pages 1-15, December.
    20. Gaëlle Vallée-Tourangeau & Ana Wheelock & Tushna Vandrevala & Priscilla Harries, 2022. "Correction: Peer reviewers' dilemmas: a qualitative exploration of decisional conflict in the evaluation of grant applications in the medical humanities and social sciences," Palgrave Communications, Palgrave Macmillan, vol. 9(1), pages 1-1, December.
    21. Florian M. Artinger & Gerd Gigerenzer & Perke Jacobs, 2022. "Satisficing: Integrating Two Traditions," Journal of Economic Literature, American Economic Association, vol. 60(2), pages 598-635, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lutz Bornmann & Julian N. Marewski, 2024. "Opium in science and society: numbers and other quantifications," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(9), pages 5313-5346, September.
    2. Balázs Győrffy & Andrea Magda Nagy & Péter Herman & Ádám Török, 2018. "Factors influencing the scientific performance of Momentum grant holders: an evaluation of the first 117 research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 409-426, October.
    3. Tóth, Tamás & Demeter, Márton & Csuhai, Sándor & Major, Zsolt Balázs, 2024. "When career-boosting is on the line: Equity and inequality in grant evaluation, productivity, and the educational backgrounds of Marie Skłodowska-Curie Actions individual fellows in social sciences an," Journal of Informetrics, Elsevier, vol. 18(2).
    4. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    5. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    6. Gabriel Okasa & Alberto de Le'on & Michaela Strinzel & Anne Jorstad & Katrin Milzow & Matthias Egger & Stefan Muller, 2024. "A Supervised Machine Learning Approach for Assessing Grant Peer Review Reports," Papers 2411.16662, arXiv.org, revised Dec 2024.
    7. Sven E. Hug & Mirjam Aeschbach, 2020. "Criteria for assessing grant applications: a systematic review," Palgrave Communications, Palgrave Macmillan, vol. 6(1), pages 1-15, December.
    8. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    9. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    10. Song Jing & Qingzhao Ma & Siyi Wang & Hanliang Xu & Tian Xu & Xia Guo & Zhuolin Wu, 2024. "Research on developmental evaluation based on the "four abilities" model: evidence from early career researchers in China," Quality & Quantity: International Journal of Methodology, Springer, vol. 58(1), pages 681-704, February.
    11. Alessandro Margherita & Gianluca Elia & Claudio Petti, 2022. "What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives," Sustainability, MDPI, vol. 14(5), pages 1-18, March.
    12. Gerald Schweiger & Adrian Barnett & Peter van den Besselaar & Lutz Bornmann & Andreas De Block & John P. A. Ioannidis & Ulf Sandstrom & Stijn Conix, 2024. "The Costs of Competition in Distributing Scarce Research Funds," Papers 2403.16934, arXiv.org.
    13. Gadzinski, Gregory & Castello, Alessio, 2020. "Fast and Frugal heuristics augmented: When machine learning quantifies Bayesian uncertainty," Journal of Behavioral and Experimental Finance, Elsevier, vol. 26(C).
    14. Popoyan, Lilit & Napoletano, Mauro & Roventini, Andrea, 2017. "Taming macroeconomic instability: Monetary and macro-prudential policy interactions in an agent-based model," Journal of Economic Behavior & Organization, Elsevier, vol. 134(C), pages 117-140.
    15. Hren, Darko & Pina, David G. & Norman, Christopher R. & Marušić, Ana, 2022. "What makes or breaks competitive research proposals? A mixed-methods analysis of research grant evaluation reports," Journal of Informetrics, Elsevier, vol. 16(2).
    16. Francesco Lamperti & Antoine Mandel & Mauro Napoletano & Alessandro Sapio & Andrea Roventini & Tomas Balint & Igor Khorenzhenko, 2017. "Taming macroeconomic instability," SciencePo Working papers Main hal-03399574, HAL.
    17. William Forbes & Egor Kiselev & Len Skerratt, 2023. "The stability and downside risk to contrarian profits: Evidence from the S&P 500," International Journal of Finance & Economics, John Wiley & Sons, Ltd., vol. 28(1), pages 733-750, January.
    18. Joachim P. Hasebrook & Leonie Michalak & Anna Wessels & Sabine Koenig & Stefan Spierling & Stefan Kirmsse, 2022. "Green Behavior: Factors Influencing Behavioral Intention and Actual Environmental Behavior of Employees in the Financial Service Sector," Sustainability, MDPI, vol. 14(17), pages 1-35, August.
    19. Joseph, Andreas, 2019. "Parametric inference with universal function approximators," Bank of England working papers 784, Bank of England, revised 22 Jul 2020.
    20. Lilit Popoyan, 2020. "Macroprudential Policy: a Blessing or a Curse?," Review of Economics and Institutions, Università di Perugia, vol. 11(1-2).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:129:y:2024:i:3:d:10.1007_s11192-023-04915-y. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.