
Peer review of grant applications in biology and medicine. Reliability, fairness, and validity

Author

  • Martin Reinhart (University of Basel)

Abstract

This paper examines the peer review procedure of a national science funding organization (the Swiss National Science Foundation) by means of the three most frequently studied criteria: reliability, fairness, and validity. The analyzed data consist of 496 applications for project-based funding in biology and medicine from the year 1998. Overall reliability is found to be fair, with an intraclass correlation coefficient of 0.41 and sizeable differences between biology (0.45) and medicine (0.20). Multiple logistic regression models reveal only scientific performance indicators as significant predictors of the funding decision, while all potential sources of bias (gender, age, nationality, and academic status of the applicant, requested amount of funding, and institutional surrounding) are non-significant predictors. Bibliometric analysis provides evidence that the decisions of a public funding organization for basic project-based research are in line with the future publication success of applicants. The paper also argues for an expansion of approaches and methodologies in peer review research by focusing increasingly on process rather than outcome and by including a more diverse set of methods, e.g. content analysis. Such an expansion will be necessary to advance peer review research beyond the abundantly treated questions of reliability, fairness, and validity.
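For readers who want to see how the two main quantities in the abstract are typically obtained, the sketch below is a minimal, purely illustrative example. It is not the paper's code: the SNSF data are not public, the simulated data frame and all column names (funded, publications, gender, age) are hypothetical, and the one-way random-effects ICC(1) shown here is only one of several ICC variants the reported coefficient could correspond to. The sketch merely shows how an intraclass correlation coefficient over reviewer scores and a multiple logistic regression of the funding decision could be computed on data of this shape.

```python
# Illustrative sketch only, using simulated data; all variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm


def icc1(scores: np.ndarray) -> float:
    """ICC(1): one-way random-effects intraclass correlation for an
    (n_applications x k_reviewers) matrix of review scores."""
    n, k = scores.shape
    grand_mean = scores.mean()
    row_means = scores.mean(axis=1)
    # Between-application and within-application (reviewer) mean squares
    ms_between = k * np.sum((row_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((scores - row_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)


rng = np.random.default_rng(0)

# Hypothetical application-level data: funding decision, one performance
# indicator, and two potential bias variables.
df = pd.DataFrame({
    "funded": rng.integers(0, 2, 496),
    "publications": rng.poisson(10, 496),
    "gender": rng.integers(0, 2, 496),
    "age": rng.normal(45, 8, 496),
})

# Fairness question: multiple logistic regression of the funding decision on
# a performance indicator and potential bias variables.
X = sm.add_constant(df[["publications", "gender", "age"]])
fit = sm.Logit(df["funded"], X).fit(disp=False)
print(fit.summary())

# Reliability question: ICC over two hypothetical reviewer scores per application.
scores = rng.normal(loc=4.0, scale=1.0, size=(496, 2))
print("ICC(1):", round(icc1(scores), 2))
```

In an analysis of this kind, non-significant coefficients on the potential bias variables (here the hypothetical gender and age columns) would correspond to the paper's fairness finding, and the ICC over reviewer scores to its reliability finding.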

Suggested Citation

  • Martin Reinhart, 2009. "Peer review of grant applications in biology and medicine. Reliability, fairness, and validity," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 789-809, December.
  • Handle: RePEc:spr:scient:v:81:y:2009:i:3:d:10.1007_s11192-008-2220-7
    DOI: 10.1007/s11192-008-2220-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-008-2220-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. L. Erik Clavería & Eliseo Guallar & Jordi Camí & José Conde & Roberto Pastor & José R. Ricoy & Eduardo Rodríguez-Farré & Fernando Ruiz-Palomo & Emilio Muñoz, 2000. "Does Peer Review Predict the Performance of Research Projects in Health Sciences?," Scientometrics, Springer;Akadémiai Kiadó, vol. 47(1), pages 11-23, January.
    2. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Selection of research fellowship recipients by committee peer review. Reliability, fairness and predictive validity of Board of Trustees' decisions," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 297-320, April.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    2. Alexandre Rodrigues Oliveira & Carlos Fernando Mello, 2016. "Importance and susceptibility of scientific productivity indicators: two sides of the same coin," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 697-722, November.
    3. Marco Pautasso, 2010. "Worsening file-drawer problem in the abstracts of natural, medical and social science databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(1), pages 193-202, October.
    4. Katarína Cechlárová & Tamás Fleiner & Eva Potpinková, 2014. "Assigning evaluators to research grant applications: the case of Slovak Research and Development Agency," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(2), pages 495-506, May.
    5. Primož Južnič & Stojan Pečlin & Matjaž Žaucer & Tilen Mandelj & Miro Pušnik & Franci Demšar, 2010. "Scientometric indicators: peer-review, bibliometric methods and conflict of interests," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(2), pages 429-441, November.
    6. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    7. Giovanni Abramo & Ciriaco D’Angelo, 2015. "An assessment of the first “scientific habilitation” for university appointments in Italy," Economia Politica: Journal of Analytical and Institutional Economics, Springer;Fondazione Edison, vol. 32(3), pages 329-357, December.
    8. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    9. Gill, Chelsea & Mehrotra, Vishal & Moses, Olayinka & Bui, Binh, 2023. "The impact of the pitching research framework on AFAANZ grant applications," Pacific-Basin Finance Journal, Elsevier, vol. 77(C).
    10. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
    11. Jorge Mañana-Rodríguez & Elea Giménez-Toledo, 2013. "Scholarly publishing in social sciences and humanities, associated probabilities of belonging and its spectrum: a quantitative approach for the Spanish case," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 893-910, March.
    12. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2015. "Testing for the fairness and predictive validity of research funding decisions: A multilevel multiple imputation for missing data approach using ex-ante and ex-post peer evaluation data from the Austrian Science Fund," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2321-2339, November.
    13. Marjolijn N. Wijnen & Jorg J. M. Massen & Mariska E. Kret, 2021. "Gender bias in the allocation of student grants," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5477-5488, July.
    14. Balázs Győrffy & Andrea Magda Nagy & Péter Herman & Ádám Török, 2018. "Factors influencing the scientific performance of Momentum grant holders: an evaluation of the first 117 research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 409-426, October.
    15. Materia, V.C. & Pascucci, S. & Kolympiris, C., 2015. "Understanding the selection processes of public research projects in agriculture: The role of scientific merit," Food Policy, Elsevier, vol. 56(C), pages 87-99.
    16. John Jerrim, 2019. "Peer-review of grant proposals. An analysis of Economic and Social Research Council grant applications," DoQSS Working Papers 19-05, Quantitative Social Science - UCL Social Research Institute, University College London.
    17. Zhongyi Wang & Keying Wang & Jiyue Liu & Jing Huang & Haihua Chen, 2022. "Measuring the innovation of method knowledge elements in scientific literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2803-2827, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Alexandre Rodrigues Oliveira & Carlos Fernando Mello, 2016. "Importance and susceptibility of scientific productivity indicators: two sides of the same coin," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 697-722, November.
    2. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
    3. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    4. Amin Mazloumian, 2012. "Predicting Scholars' Scientific Impact," PLOS ONE, Public Library of Science, vol. 7(11), pages 1-5, November.
    5. Flaminio Squazzoni & Károly Takács, 2011. "Social Simulation That 'Peers into Peer Review'," Journal of Artificial Societies and Social Simulation, Journal of Artificial Societies and Social Simulation, vol. 14(4), pages 1-3.
    6. Qianjin Zong & Yafen Xie & Rongchan Tuo & Jingshi Huang & Yang Yang, 2019. "The impact of video abstract on citation counts: evidence from a retrospective cohort study of New Journal of Physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1715-1727, June.
    7. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Gatekeepers of science—Effects of external reviewers’ attributes on the assessments of fellowship applications," Journal of Informetrics, Elsevier, vol. 1(1), pages 83-91.
    8. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    9. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    10. Mario Paolucci & Francisco Grimaldo, 2014. "Mechanism change in a simulation of peer review: from junk support to elitism," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 663-688, June.
    11. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    12. Ulf Sandström & Martin Hällsten, 2008. "Persistent nepotism in peer-review," Scientometrics, Springer;Akadémiai Kiadó, vol. 74(2), pages 175-189, February.
    13. Marsh, Herbert W. & Jayasinghe, Upali W. & Bond, Nigel W., 2011. "Gender differences in peer reviews of grant applications: A substantive-methodological synergy in support of the null hypothesis model," Journal of Informetrics, Elsevier, vol. 5(1), pages 167-180.
    14. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    15. Stefan Hornbostel & Susan Böhmer & Bernd Klingsporn & Jörg Neufeld & Markus Ins, 2009. "Funding of young scientist and scientific excellence," Scientometrics, Springer;Akadémiai Kiadó, vol. 79(1), pages 171-190, April.
    16. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182.
    17. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    18. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Convergent validation of peer review decisions using the h index," Journal of Informetrics, Elsevier, vol. 1(3), pages 204-213.
    19. Stephen A Gallo & Afton S Carpenter & David Irwin & Caitlin D McPartland & Joseph Travis & Sofie Reynders & Lisa A Thompson & Scott R Glisson, 2014. "The Validation of Peer Review through Research Impact Measures and the Implications for Funding Strategies," PLOS ONE, Public Library of Science, vol. 9(9), pages 1-9, September.
    20. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "Latent Markov modeling applied to grant peer review," Journal of Informetrics, Elsevier, vol. 2(3), pages 217-228.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.