
Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes

Author

Listed:
  • Lutz Bornmann
  • Gerlind Wallon
  • Anna Ledin

Abstract

Does peer review fulfill its declared objective of identifying the best science and the best scientists? To answer this question, we analyzed the Long-Term Fellowship and Young Investigator programmes of the European Molecular Biology Organization. Both programmes aim to identify and support the best postdoctoral fellows and young group leaders in the life sciences. We examined the association between selection decisions and the scientific performance of the applicants. Our study used publication and citation data for 668 applicants to the Long-Term Fellowship programme from 1998 (130 approved, 538 rejected) and 297 applicants to the Young Investigator programme from 2001 and 2002 (39 approved, 258 rejected). If the quantity and impact of research publications are used as the criterion for scientific achievement, the results of (zero-truncated) negative binomial models show that the peer review process indeed selects scientists who, subsequent to application, perform at a higher level than the rejected applicants. We also determined the extent of errors due to over-estimation (type I errors) and under-estimation (type II errors) of future scientific performance. Our statistical analyses indicate that between 26% and 48% of the decisions to award or reject an application exhibit one of these two error types. Thus, although the selection committee did not correctly estimate future performance for a portion of the applicants, the results show a statistically significant association between selection decisions and the applicants' subsequent scientific achievement, again taking the quantity and impact of research publications as the criterion.
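
The modelling step described in the abstract can be sketched as follows: a negative binomial regression of post-application citation counts on the committee's funding decision, where a rate ratio above 1 means funded applicants outperform rejected ones. This is a minimal sketch on toy data, not the authors' code; it uses a plain (not zero-truncated) negative binomial model, and the variable names and the simple top-k error-counting scheme at the end are illustrative assumptions.

    import numpy as np
    import statsmodels.api as sm

    # Toy data standing in for the EMBO applicant sample: `approved` marks
    # the committee decision, `citations` the post-application citation counts.
    rng = np.random.default_rng(0)
    n = 300
    approved = rng.integers(0, 2, size=n)            # 1 = funded, 0 = rejected
    citations = rng.negative_binomial(2, np.where(approved == 1, 0.05, 0.08))

    # Negative binomial regression of citation counts on the funding decision.
    # (The paper uses zero-truncated variants; plain NB2 keeps the sketch short.)
    X = sm.add_constant(approved)
    fit = sm.NegativeBinomial(citations, X).fit(disp=0)
    print("rate ratio (funded vs. rejected):", np.exp(fit.params[1]))

    # Illustrative type I / type II error shares under an assumed scheme:
    # rank applicants by later citations and call the top n_funded the "best".
    n_funded = int(approved.sum())                   # assumes n_funded > 0
    best = citations >= np.sort(citations)[-n_funded]
    type1 = np.mean((approved == 1) & ~best)         # funded, not among best
    type2 = np.mean((approved == 0) & best)          # among best, rejected
    print(f"type I share: {type1:.2%}, type II share: {type2:.2%}")

The sum of the two shares corresponds loosely to the 26%-48% range quoted in the abstract, though the paper's actual error definitions differ in detail from this top-k cut.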

Suggested Citation

  • Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes," PLOS ONE, Public Library of Science, vol. 3(10), pages 1-11, October.
  • Handle: RePEc:plo:pone00:0003480
    DOI: 10.1371/journal.pone.0003480

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0003480
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0003480&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0003480?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    2. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2010. "The h index research output measurement: Two approaches to enhance its accuracy," Journal of Informetrics, Elsevier, vol. 4(3), pages 407-414.
    3. Richard R Snell, 2015. "Menage a Quoi? Optimal Number of Peer Reviewers," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-14, April.
    4. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    5. Annita Nugent & Ho Fai Chan & Uwe Dulleck, 2022. "Government funding of university-industry collaboration: exploring the impact of targeted funding on university patent activity," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(1), pages 29-73, January.
    6. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    7. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    8. Bornmann, Lutz & Leydesdorff, Loet & Van den Besselaar, Peter, 2010. "A meta-evaluation of scientific research proposals: Different ways of comparing rejected to awarded applications," Journal of Informetrics, Elsevier, vol. 4(3), pages 211-220.
    9. Hendy Abdoul & Christophe Perrey & Philippe Amiel & Florence Tubach & Serge Gottot & Isabelle Durand-Zaleski & Corinne Alberti, 2012. "Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices," PLOS ONE, Public Library of Science, vol. 7(9), pages 1-15, September.
    10. Eric Libby & Leon Glass, 2010. "The Calculus of Committee Composition," PLOS ONE, Public Library of Science, vol. 5(9), pages 1-8, September.
    11. Filippo Radicchi & Claudio Castellano, 2013. "Analysis of bibliometric indicators for individual scholars in a large data set," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(3), pages 627-637, December.
    12. T. S. Evans & N. Hopkins & B. S. Kaube, 2012. "Universality of performance indicators based on citation and reference counts," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(2), pages 473-495, November.
    13. Radicchi, Filippo & Weissman, Alexander & Bollen, Johan, 2017. "Quantifying perceived impact of scientific publications," Journal of Informetrics, Elsevier, vol. 11(3), pages 704-712.
    14. Irwin Feller, 2013. "Peer review and expert panels as techniques for evaluating the quality of academic research," Chapters, in: Albert N. Link & Nicholas S. Vonortas (ed.), Handbook on the Theory and Practice of Program Evaluation, chapter 5, pages 115-142, Edward Elgar Publishing.
    15. Marcin Kozak & Lutz Bornmann, 2012. "A New Family of Cumulative Indexes for Measuring Scientific Performance," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-4, October.
    16. Yu-Wei Chang & Dar-Zen Chen & Mu-Hsuan Huang, 2021. "Do extraordinary science and technology scientists balance their publishing and patenting activities?," PLOS ONE, Public Library of Science, vol. 16(11), pages 1-20, November.
    17. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182.
    18. Hendy Abdoul & Christophe Perrey & Florence Tubach & Philippe Amiel & Isabelle Durand-Zaleski & Corinne Alberti, 2012. "Non-Financial Conflicts of Interest in Academic Grant Evaluation: A Qualitative Study of Multiple Stakeholders in France," PLOS ONE, Public Library of Science, vol. 7(4), pages 1-10, April.
    19. Bornmann, Lutz & Daniel, Hans-Dieter, 2009. "Extent of type I and type II errors in editorial decisions: A case study on Angewandte Chemie International Edition," Journal of Informetrics, Elsevier, vol. 3(4), pages 348-352.
    20. Bornmann, Lutz & Williams, Richard, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
    21. Dennis L Murray & Douglas Morris & Claude Lavoie & Peter R Leavitt & Hugh MacIsaac & Michael E J Masson & Marc-Andre Villard, 2016. "Bias in Research Grant Evaluation Has Dire Consequences for Small Universities," PLOS ONE, Public Library of Science, vol. 11(6), pages 1-19, June.
    22. Radicchi, Filippo & Castellano, Claudio, 2012. "Testing the fairness of citation indicators for comparison across scientific domains: The case of fractional citation counts," Journal of Informetrics, Elsevier, vol. 6(1), pages 121-130.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.