
Bias in Research Grant Evaluation Has Dire Consequences for Small Universities

Author

Listed:
  • Dennis L Murray
  • Douglas Morris
  • Claude Lavoie
  • Peter R Leavitt
  • Hugh MacIsaac
  • Michael E J Masson
  • Marc-André Villard

Abstract

Federal funding for basic scientific research is the cornerstone of societal progress, economic growth, health, and well-being. There is a direct relationship between financial investment in science and a nation’s scientific discoveries, making it a priority for governments to distribute public funding appropriately in support of the best science. However, research grant success rates and funding levels can be skewed toward certain groups of applicants, and such skew may be driven by systemic bias arising during grant proposal evaluation and scoring. Policies to best redress this problem are not well established. Here, we show that funding success and grant amounts for applications to Canada’s Natural Sciences and Engineering Research Council (NSERC) Discovery Grant program (2011–2014) are consistently lower for applicants from small institutions. This pattern persists across applicant experience levels and is consistent among the three criteria used to score grant proposals; we therefore interpret it as systemic bias against applicants from small institutions. When current funding success rates are projected forward, forecasts reveal that science funding at small schools in Canada will decline precipitously in the next decade if these skews are left uncorrected. We show that a recently adopted pilot program to bolster success by lowering standards for select applicants from small institutions will not erase funding skew, nor will several other post-evaluation corrective measures. Rather, to support objective and robust review of grant applications, research councils must address evaluation skew directly, by adopting procedures such as blind review of research proposals and bibliometric assessment of performance. Such measures will be important in restoring confidence in the objectivity and fairness of science funding decisions. Likewise, small institutions can improve their research success by more strongly supporting productive researchers and by developing competitive graduate programming opportunities.

Suggested Citation

  • Dennis L Murray & Douglas Morris & Claude Lavoie & Peter R Leavitt & Hugh MacIsaac & Michael E J Masson & Marc-André Villard, 2016. "Bias in Research Grant Evaluation Has Dire Consequences for Small Universities," PLOS ONE, Public Library of Science, vol. 11(6), pages 1-19, June.
  • Handle: RePEc:plo:pone00:0155876
    DOI: 10.1371/journal.pone.0155876

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0155876
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0155876&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0155876?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Does the Committee Peer Review Select the Best Applicants for Funding? An Investigation of the Selection Process for Two European Molecular Biology Organization Programmes," PLOS ONE, Public Library of Science, vol. 3(10), pages 1-11, October.
    2. Mushin Lee & Kiyong Om & Joon Koh, 2000. "The Bias of Sighted Reviewers in Research Proposal Evaluation: A Comparative Analysis of Blind and Open Review in Korea," Scientometrics, Springer;Akadémiai Kiadó, vol. 48(1), pages 99-116, June.
    3. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    4. Heinze, Thomas & Shapira, Philip & Rogers, Juan D. & Senker, Jacqueline M., 2009. "Organizational and institutional influences on creativity in scientific research," Research Policy, Elsevier, vol. 38(4), pages 610-623, May.
    5. David A. King, 2004. "The scientific impact of nations," Nature, Nature, vol. 430(6997), pages 311-316, July.
    6. Day, Theodore Eugene, 2015. "The big consequences of small biases: A simulation of peer review," Research Policy, Elsevier, vol. 44(6), pages 1266-1270.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. van Dalen, Hendrik Peter, 2020. "How the Publish-or-Perish Principle Divides a Science: The Case of Academic Economists," Discussion Paper 2020-020, Tilburg University, Center for Economic Research.
    2. Gill, Chelsea & Mehrotra, Vishal & Moses, Olayinka & Bui, Binh, 2023. "The impact of the pitching research framework on AFAANZ grant applications," Pacific-Basin Finance Journal, Elsevier, vol. 77(C).
    3. Franklin G. Mixon & Kamal P. Upadhyaya, 2024. "When forgiveness beats permission: Exploring the scholarly ethos of clinical faculty in economics," American Journal of Economics and Sociology, Wiley Blackwell, vol. 83(1), pages 75-91, January.
    4. Marco Cozzi, 2020. "Public Funding of Research and Grant Proposals in the Social Sciences: Empirical Evidence from Canada," Department Discussion Papers 1809, Department of Economics, University of Victoria.
    5. Marta Entradas & Martin W Bauer & Colm O'Muircheartaigh & Frank Marcinkowski & Asako Okamura & Giuseppe Pellegrini & John Besley & Luisa Massarani & Pedro Russo & Anthony Dudo & Barbara Saracino & Car, 2020. "Public communication by research institutes compared across countries and sciences: Building capacity for engagement or competing for visibility?," PLOS ONE, Public Library of Science, vol. 15(7), pages 1-17, July.
    6. Li, Heyang & Wu, Meijun & Wang, Yougui & Zeng, An, 2022. "Bibliographic coupling networks reveal the advantage of diversification in scientific projects," Journal of Informetrics, Elsevier, vol. 16(3).
    7. Fulya Y. Ersoy & Jennifer Pate, 2023. "Invisible hurdles: Gender and institutional differences in the evaluation of economics papers," Economic Inquiry, Western Economic Association International, vol. 61(4), pages 777-797, October.
    8. Marta Entradas & João M. Santos, 2021. "Returns of research funding are maximised in media visibility for excellent institutes," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-8, December.
    9. Seeber, Marco & Alon, Ilan & Pina, David G. & Piro, Fredrik Niclas & Seeber, Michele, 2022. "Predictors of applying for and winning an ERC Proof-of-Concept grant: An automated machine learning model," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    10. Grażyna Wieczorkowska & Katarzyna Kowalczyk, 2021. "Ensuring Sustainable Evaluation: How to Improve Quality of Evaluating Grant Proposals?," Sustainability, MDPI, vol. 13(5), pages 1-11, March.
    11. Walker, James & Brewster, Chris & Fontinha, Rita & Haak-Saheem, Washika & Benigni, Stefano & Lamperti, Fabio & Ribaudo, Dalila, 2022. "The unintended consequences of the pandemic on non-pandemic research activities," Research Policy, Elsevier, vol. 51(1).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Seeber, Marco & Alon, Ilan & Pina, David G. & Piro, Fredrik Niclas & Seeber, Michele, 2022. "Predictors of applying for and winning an ERC Proof-of-Concept grant: An automated machine learning model," Technological Forecasting and Social Change, Elsevier, vol. 184(C).
    2. Thomas Feliciani & Junwen Luo & Lai Ma & Pablo Lucas & Flaminio Squazzoni & Ana Marušić & Kalpana Shankar, 2019. "A scoping review of simulation models of peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 555-594, October.
    3. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    4. Richard R Snell, 2015. "Ménage à Quoi? Optimal Number of Peer Reviewers," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-14, April.
    5. C. Sean Burns & Charles W. Fox, 2017. "Language and socioeconomics predict geographic variation in peer review outcomes at an ecology journal," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1113-1127, November.
    6. Jürgen Janger & Nicole Schmidt & Anna Strauss, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664, February.
    7. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    8. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    9. Elena Veretennik & Maria Yudkevich, 2023. "Inconsistent quality signals: evidence from the regional journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3675-3701, June.
    10. Meyer, Matthias & Waldkirch, Rüdiger W. & Duscher, Irina & Just, Alexander, 2018. "Drivers of citations: An analysis of publications in “top” accounting journals," Critical Perspectives on Accounting, Elsevier, vol. 51(C), pages 24-46.
    11. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    12. David Card & Stefano DellaVigna, 2017. "What do Editors Maximize? Evidence from Four Leading Economics Journals," NBER Working Papers 23282, National Bureau of Economic Research, Inc.
    13. J. A. García & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia, 2016. "Why the referees’ reports I receive as an editor are so much better than the reports I receive as an author?," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(3), pages 967-986, March.
    14. Dietmar Wolfram & Peiling Wang & Adam Hembree & Hyoungjoo Park, 2020. "Open peer review: promoting transparency in open science," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1033-1051, November.
    15. Andrada Elena Urda-Cîmpean & Sorana D. Bolboacă & Andrei Achimaş-Cadariu & Tudor Cătălin Drugan, 2016. "Knowledge Production in Two Types of Medical PhD Routes—What’s to Gain?," Publications, MDPI, vol. 4(2), pages 1-16, June.
    16. Rosa Rodriguez-Sánchez & J. A. García & J. Fdez-Valdivia, 2018. "Editorial decisions with informed and uninformed reviewers," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 25-43, October.
    17. Randa Alsabahi, 2022. "English Medium Publications: Opening or Closing Doors to Authors with Non-English Language Backgrounds," English Language Teaching, Canadian Center of Science and Education, vol. 15(10), pages 1-18, October.
    18. Yuetong Chen & Hao Wang & Baolong Zhang & Wei Zhang, 2022. "A method of measuring the article discriminative capacity and its distribution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3317-3341, June.
    19. Qianjin Zong & Yafen Xie & Jiechun Liang, 2020. "Does open peer review improve citation count? Evidence from a propensity score matching analysis of PeerJ," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 607-623, October.
    20. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0155876. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.