Printed from https://ideas.repec.org/a/eee/infome/v2y2008i3p217-228.html

Latent Markov modeling applied to grant peer review

Author

Listed:
  • Bornmann, Lutz
  • Mutz, Rüdiger
  • Daniel, Hans-Dieter

Abstract

In the grant peer review process we can distinguish various evaluation stages in which assessors judge applications on a rating scale. Research on grant peer review that takes this multi-stage character into account is scarce. In this study we analyze 1954 applications for doctoral and post-doctoral fellowships from the Boehringer Ingelheim Fonds (B.I.F.), which are evaluated in three stages (first: evaluation by an external reviewer; second: internal evaluation by a staff member; third: final decision by the B.I.F. Board of Trustees). The results of a latent Markov model (in combination with latent class analysis) show that a fellowship application has a chance of approval only if it is already recommended for support in the first evaluation stage, that is, if the external reviewer's evaluation is positive. Based on these results, a form of triage or pre-screening of applications seems desirable.
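The latent Markov approach described above can be sketched with a minimal forward-algorithm computation: a small number of latent applicant classes, observed indirectly through ratings at the three evaluation stages. All probabilities below are illustrative placeholders, not the estimates reported in the article, and the two-class/binary-rating setup is a simplifying assumption for the sketch.

```python
import numpy as np

# Latent Markov sketch: 2 hypothetical latent classes ("weak", "strong"),
# observed via binary ratings (0 = negative, 1 = positive) at three
# evaluation stages. The numbers are illustrative, NOT the article's estimates.

pi = np.array([0.6, 0.4])            # initial latent-class distribution
A = np.array([[0.9, 0.1],            # P(class at stage t+1 | class at t)
              [0.2, 0.8]])
B = np.array([[0.8, 0.2],            # P(rating | latent class)
              [0.1, 0.9]])

def forward(obs):
    """Forward algorithm: P(observed rating sequence) under the model."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

def posterior_final_class(obs):
    """P(latent class at the final stage | ratings), normalized forward pass."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha / alpha.sum()
```

In a full analysis the parameters `pi`, `A`, and `B` would be estimated from the rating data (e.g., via EM), and the posterior over the final-stage class plays the role of the predicted approval outcome given the earlier stages' ratings.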

Suggested Citation

  • Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "Latent Markov modeling applied to grant peer review," Journal of Informetrics, Elsevier, vol. 2(3), pages 217-228.
  • Handle: RePEc:eee:infome:v:2:y:2008:i:3:p:217-228
    DOI: 10.1016/j.joi.2008.05.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157708000278
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2008.05.003?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Upali W. Jayasinghe & Herbert W. Marsh & Nigel Bond, 2003. "A multilevel cross‐classified modelling approach to peer review of grant proposals: the effects of assessor and researcher attributes on assessor ratings," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 166(3), pages 279-300, October.
    2. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Gatekeepers of science—Effects of external reviewers’ attributes on the assessments of fellowship applications," Journal of Informetrics, Elsevier, vol. 1(1), pages 83-91.
    3. Lutz Bornmann & Ruediger Mutz & Hans-Dieter Daniel, 2007. "Row-column (RC) association model applied to grant peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 73(2), pages 139-147, November.
    4. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Selection of research fellowship recipients by committee peer review. Reliability, fairness and predictive validity of Board of Trustees' decisions," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 297-320, April.
    5. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Convergent validation of peer review decisions using the h index," Journal of Informetrics, Elsevier, vol. 1(3), pages 204-213.
    6. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Committee peer review at an international research foundation: predictive validity and fairness of selection decisions on post-graduate fellowship applications," Research Evaluation, Oxford University Press, vol. 14(1), pages 15-20, April.
    7. Lutz Bornmann & Hans-Dieter Daniel, 2006. "Selecting scientific excellence through committee peer review - A citation analysis of publications previously published to approval or rejection of post-doctoral research fellowship applicants," Scientometrics, Springer;Akadémiai Kiadó, vol. 68(3), pages 427-440, September.
    8. Lutz Bornmann & Hans-Dieter Daniel, 2006. "Potential sources of bias in research fellowship assessments: effects of university prestige and field of study," Research Evaluation, Oxford University Press, vol. 15(3), pages 209-219, December.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    2. Lutz Bornmann & Hanna Herich & Hanna Joos & Hans-Dieter Daniel, 2012. "In public peer review of submitted manuscripts, how do reviewer comments differ from comments written by interested members of the scientific community? A content analysis of comments written for Atmo," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(3), pages 915-929, December.
    3. Lutz Bornmann & Rüdiger Mutz & Hans-Dieter Daniel, 2009. "The influence of the applicants’ gender on the modeling of a peer review process by using latent Markov models," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(2), pages 407-411, November.
    4. Linton, Jonathan D., 2016. "Improving the Peer review process: Capturing more information and enabling high-risk/high-return research," Research Policy, Elsevier, vol. 45(9), pages 1936-1938.
    5. F. Bartolucci & A. Farcomeni & F. Pennoni, 2014. "Latent Markov models: a review of a general framework for the analysis of longitudinal data with covariates," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 23(3), pages 433-465, September.
    6. Andrea Bonaccorsi & Luca Secondi, 2017. "The determinants of research performance in European universities: a large scale multilevel analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1147-1178, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2008. "How to detect indications of potential sources of bias in peer review: A generalized latent variable modeling approach exemplified by a gender study," Journal of Informetrics, Elsevier, vol. 2(4), pages 280-287.
    2. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    3. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Gatekeepers of science—Effects of external reviewers’ attributes on the assessments of fellowship applications," Journal of Informetrics, Elsevier, vol. 1(1), pages 83-91.
    4. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    5. Stefan Hornbostel & Susan Böhmer & Bernd Klingsporn & Jörg Neufeld & Markus Ins, 2009. "Funding of young scientist and scientific excellence," Scientometrics, Springer;Akadémiai Kiadó, vol. 79(1), pages 171-190, April.
    6. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    7. Bornmann, Lutz & Daniel, Hans-Dieter, 2007. "Convergent validation of peer review decisions using the h index," Journal of Informetrics, Elsevier, vol. 1(3), pages 204-213.
    8. Materia, V.C. & Pascucci, S. & Kolympiris, C., 2015. "Understanding the selection processes of public research projects in agriculture: The role of scientific merit," Food Policy, Elsevier, vol. 56(C), pages 87-99.
    9. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2015. "Testing for the fairness and predictive validity of research funding decisions: A multilevel multiple imputation for missing data approach using ex-ante and ex-post peer evaluation data from the Austr," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2321-2339, November.
    10. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
    11. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    12. Mario Paolucci & Francisco Grimaldo, 2014. "Mechanism change in a simulation of peer review: from junk support to elitism," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(3), pages 663-688, June.
    13. Marsh, Herbert W. & Jayasinghe, Upali W. & Bond, Nigel W., 2011. "Gender differences in peer reviews of grant applications: A substantive-methodological synergy in support of the null hypothesis model," Journal of Informetrics, Elsevier, vol. 5(1), pages 167-180.
    14. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    15. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182.
    16. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    17. Qurat-ul Ain & Hira Riaz & Muhammad Tanvir Afzal, 2019. "Evaluation of h-index and its citation intensity based variants in the field of mathematics," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 187-211, April.
    18. Azzurra Ragone & Katsiaryna Mirylenka & Fabio Casati & Maurizio Marchese, 2013. "On peer review in computer science: analysis of its effectiveness and suggestions for improvement," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 317-356, November.
    19. Alexandre Rodrigues Oliveira & Carlos Fernando Mello, 2016. "Importance and susceptibility of scientific productivity indicators: two sides of the same coin," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 697-722, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:2:y:2008:i:3:p:217-228. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.