
The Costs of Competition in Distributing Scarce Research Funds

Authors
  • Gerald Schweiger
  • Adrian Barnett
  • Peter van den Besselaar
  • Lutz Bornmann
  • Andreas De Block
  • John P. A. Ioannidis
  • Ulf Sandstrom
  • Stijn Conix

Abstract

Research funding systems are not isolated: they are embedded in a larger scientific system and exert an enormous influence on it. This paper analyzes the allocation of competitive research funding from several perspectives: How reliable are funding decision processes? What are the economic costs of competitive funding? How does competition for funds affect the pursuit of risky research? How do competitive funding environments affect scientists themselves, and which ethical issues must be considered? We identify gaps in our knowledge of research funding systems and propose recommendations for policymakers and funding agencies, including empirical experiments on decision processes and the collection of data on these processes. With these recommendations we hope to contribute to better ways of organizing research funding.

Suggested Citation

  • Gerald Schweiger & Adrian Barnett & Peter van den Besselaar & Lutz Bornmann & Andreas De Block & John P. A. Ioannidis & Ulf Sandstrom & Stijn Conix, 2024. "The Costs of Competition in Distributing Scarce Research Funds," Papers 2403.16934, arXiv.org.
  • Handle: RePEc:arx:papers:2403.16934

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2403.16934
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    2. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    3. Kyle R. Myers, 2022. "Some Tradeoffs of Competition in Grant Contests," Papers 2207.02379, arXiv.org, revised Mar 2024.
    4. van den Besselaar, Peter, 2012. "Selection committee membership: Service or self-service," Journal of Informetrics, Elsevier, vol. 6(4), pages 580-585.
    5. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    6. Veugelers, Reinhilde & Stephan, Paula & Wang, Jian, 2022. "Do funding agencies select and enable risky research: Evidence from ERC using novelty as a proxy of risk taking," CEPR Discussion Papers 17684, C.E.P.R. Discussion Papers.
    7. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    8. Joshua Ettinger & Friederike E. L. Otto & E. Lisa F. Schipper, 2021. "Storytelling can be a powerful tool for science," Nature, Nature, vol. 589(7842), pages 352-352, January.
    9. Martin Reinhart, 2009. "Peer review of grant applications in biology and medicine. Reliability, fairness, and validity," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 789-809, December.
    10. M.H. MacRoberts & B.R. MacRoberts, 2010. "Problems of citation analysis: A study of uncited and seldom-cited influences," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(1), pages 1-12, January.
    11. Shalabh, 2021. "Statistical inference via data science," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 1155-1155, July.
    12. Xuan Zhen Liu & Hui Fang, 2012. "Peer review and over-competitive research funding fostering mainstream opinion to monopoly. Part II," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(2), pages 607-616, February.
    13. John P. A. Ioannidis, 2011. "Fund people not projects," Nature, Nature, vol. 477(7366), pages 529-531, September.
    15. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    16. Ted von Hippel & Courtney von Hippel, 2015. "To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits," PLOS ONE, Public Library of Science, vol. 10(3), pages 1-8, March.
    17. Krist Vaesen & Joel Katzav, 2017. "How much would each researcher receive if competitive government research funding were distributed equally among researchers?," PLOS ONE, Public Library of Science, vol. 12(9), pages 1-11, September.
    18. Bornmann, Lutz & Leydesdorff, Loet & Van den Besselaar, Peter, 2010. "A meta-evaluation of scientific research proposals: Different ways of comparing rejected to awarded applications," Journal of Informetrics, Elsevier, vol. 4(3), pages 211-220.
    19. Gowri Gopalakrishna & Gerben ter Riet & Gerko Vink & Ineke Stoop & Jelte M Wicherts & Lex M Bouter, 2022. "Prevalence of questionable research practices, research misconduct and their potential explanatory factors: A survey among academic researchers in The Netherlands," PLOS ONE, Public Library of Science, vol. 17(2), pages 1-16, February.
    20. Stephen A Gallo & Afton S Carpenter & David Irwin & Caitlin D McPartland & Joseph Travis & Sofie Reynders & Lisa A Thompson & Scott R Glisson, 2014. "The Validation of Peer Review through Research Impact Measures and the Implications for Funding Strategies," PLOS ONE, Public Library of Science, vol. 9(9), pages 1-9, September.
    21. Shouhuai Xu & Moti Yung & Jingguo Wang, 2021. "Seeking Foundations for the Science of Cyber Security," Information Systems Frontiers, Springer, vol. 23(2), pages 263-267, April.
    22. Caroline S. Wagner & Jeffrey Alexander, 2013. "Evaluating transformative research programmes: A case study of the NSF Small Grants for Exploratory Research programme," Research Evaluation, Oxford University Press, vol. 22(3), pages 187-197, June.
    23. Chris Woolston, 2020. "Postdoc survey reveals disenchantment with working life," Nature, Nature, vol. 587(7834), pages 505-508, November.
    24. Johan Bollen & David Crandall & Damion Junk & Ying Ding & Katy Börner, 2017. "An efficient system to fund science: from proposal review to peer-to-peer distributions," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 521-528, January.
    25. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    26. Jamie Shaw, 2023. "Peer review in funding-by-lottery: A systematic overview and expansion," Research Evaluation, Oxford University Press, vol. 32(1), pages 86-100.
    27. Elaine Howard Ecklund & Anne E Lincoln, 2011. "Scientists Want More Children," PLOS ONE, Public Library of Science, vol. 6(8), pages 1-4, August.
    28. Jean J. Wang & Sarah X. Shao & Fred Y. Ye, 2021. "Identifying 'seed' papers in sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 6001-6011, July.
    29. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    30. Peter van den Besselaar & Ulf Sandström, 2019. "Measuring researcher independence using bibliometric data: A proposal for a new performance indicator," PLOS ONE, Public Library of Science, vol. 14(3), pages 1-20, March.
    31. Sandström, Ulf & Van den Besselaar, Peter, 2018. "Funding, evaluation, and the performance of national research systems," Journal of Informetrics, Elsevier, vol. 12(1), pages 365-384.
    32. Link, Albert N. & Swann, Christopher A. & Bozeman, Barry, 2008. "A time allocation study of university faculty," Economics of Education Review, Elsevier, vol. 27(4), pages 363-374, August.
    33. Qi Wang & Ulf Sandström, 2015. "Defining the role of cognitive distance in the peer review process with an explorative study of a grant scheme in infection biology," Research Evaluation, Oxford University Press, vol. 24(3), pages 271-281.
    34. Johan S. G. Chu & James A. Evans, 2021. "Slowed canonical progress in large fields of science," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 118(41), article e2021636118, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Cinzia Daraio & Simone Di Leo & Loet Leydesdorff, 2022. "Using the Leiden Rankings as a Heuristics: Evidence from Italian universities in the European landscape," LEM Papers Series 2022/08, Laboratory of Economics and Management (LEM), Sant'Anna School of Advanced Studies, Pisa, Italy.
    2. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    3. Thomas, Duncan Andrew & Ramos-Vielba, Irene, 2022. "Reframing study of research(er) funding towards configurations and trails," SocArXiv uty2v, Center for Open Science.
    4. Cinzia Daraio & Simone Di Leo & Loet Leydesdorff, 2023. "A heuristic approach based on Leiden rankings to identify outliers: evidence from Italian universities in the European landscape," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 483-510, January.
    5. Zhuanlan Sun & C. Clark Cao & Sheng Liu & Yiwei Li & Chao Ma, 2024. "Behavioral consequences of second-person pronouns in written communications between authors and reviewers of scientific papers," Nature Communications, Nature, vol. 15(1), pages 1-12, December.
    6. Liyin Zhang & Yuchen Qian & Chao Ma & Jiang Li, 2023. "Continued collaboration shortens the transition period of scientists who move to another institution," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1765-1784, March.
    7. Jiang, Zhuoren & Lin, Tianqianjin & Huang, Cui, 2023. "Deep representation learning of scientific paper reveals its potential scholarly impact," Journal of Informetrics, Elsevier, vol. 17(1).
    8. Michael Färber & Melissa Coutinho & Shuzhou Yuan, 2023. "Biases in scholarly recommender systems: impact, prevalence, and mitigation," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(5), pages 2703-2736, May.
    9. Malte Hückstädt, 2023. "Ten reasons why research collaborations succeed—a random forest approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1923-1950, March.
    10. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).
    11. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    12. Kevin W. Boyack & Caleb Smith & Richard Klavans, 2018. "Toward predicting research proposal success," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 449-461, February.
    13. Weihua Li & Sam Zhang & Zhiming Zheng & Skyler J. Cranmer & Aaron Clauset, 2022. "Untangling the network effects of productivity and prominence among scientists," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    14. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    15. Katchanov, Yurij L. & Markova, Yulia V. & Shmatko, Natalia A., 2023. "Uncited papers in the structure of scientific communication," Journal of Informetrics, Elsevier, vol. 17(2).
    16. Kyle Myers & Wei Yang Tham, 2023. "Money, Time, and Grant Design," Papers 2312.06479, arXiv.org.
    17. Liang, Zhentao & Ba, Zhichao & Mao, Jin & Li, Gang, 2023. "Research complexity increases with scientists’ academic age: Evidence from library and information science," Journal of Informetrics, Elsevier, vol. 17(1).
    18. Manuel Goyanes & Márton Demeter & Aurea Grané & Tamás Tóth & Homero Gil Zúñiga, 2023. "Research patterns in communication (2009–2019): testing female representation and productivity differences, within the most cited authors and the field," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 137-156, January.
    19. Eitan Frachtenberg, 2022. "Multifactor Citation Analysis over Five Years: A Case Study of SIGMETRICS Papers," Publications, MDPI, vol. 10(4), pages 1-16, December.
    20. Marek Kwiek & Wojciech Roszka, 2022. "Academic vs. biological age in research on academic careers: a large-scale study with implications for scientifically developing systems," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3543-3575, June.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2403.16934. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/ .

    Please note that corrections may take a couple of weeks to propagate through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.