
How much would each researcher receive if competitive government research funding were distributed equally among researchers?

Author

Listed:
  • Krist Vaesen
  • Joel Katzav

Abstract

Scientists are increasingly dissatisfied with funding systems that rely on peer assessment and, accordingly, have put forward several proposals for reform. One of these proposals is to distribute available funds equally among all qualified researchers, with no interference from peer review. Despite its numerous benefits, such egalitarian sharing faces, among other objections, the worry that it would lead to an unacceptable dilution of resources. The aim of the present paper is to assess this particular objection. We estimate (for the Netherlands, the U.S. and the U.K.) how much researchers would receive were they to get an equal share of the government budgets that are currently allocated through competitive peer assessment. For the Netherlands, we further estimate what researchers would receive were we to differentiate between researchers working in low-cost, intermediate-cost and high-cost disciplines. Given these estimates, we then determine what researchers could afford in terms of PhD students, postdocs, travel and equipment. According to our results, researchers could, on average, maintain current PhD student and postdoc employment levels and still have at their disposal a moderate (the U.K.) to considerable (the Netherlands, the U.S.) budget for travel and equipment. This suggests that the worry that egalitarian sharing leads to an unacceptable dilution of resources is unjustified. Indeed, our results strongly suggest that there is room for a far more egalitarian distribution of funds than occurs in the highly competitive funding schemes so prevalent today.
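
To make the arithmetic behind these estimates concrete, here is a minimal Python sketch of the equal-division calculation the paper describes. All figures in it (budget, headcount, staff costs) are invented placeholders for illustration; the paper's actual numbers come from country-specific government funding statistics.

    # Back-of-the-envelope sketch of egalitarian funding division.
    # All numbers used below are hypothetical, not the paper's data.

    def equal_share(total_budget: float, n_researchers: int) -> float:
        """Annual amount per researcher if the competitive budget is split equally."""
        return total_budget / n_researchers

    def discretionary_budget(share: float, n_phd: int, phd_cost: float,
                             n_postdoc: int, postdoc_cost: float) -> float:
        """What remains for travel and equipment after staff costs are covered."""
        return share - n_phd * phd_cost - n_postdoc * postdoc_cost

    if __name__ == "__main__":
        # Hypothetical: a 2 billion competitive budget shared by 10,000 researchers.
        share = equal_share(total_budget=2.0e9, n_researchers=10_000)
        # Hypothetical annual costs for one PhD student and one postdoc.
        leftover = discretionary_budget(share, n_phd=1, phd_cost=50_000.0,
                                        n_postdoc=1, postdoc_cost=60_000.0)
        print(f"Equal share per researcher: {share:,.0f} per year")
        print(f"Left for travel and equipment: {leftover:,.0f} per year")

Differentiating by discipline cost, as the paper does for the Netherlands, would simply replace the single equal share with separate per-discipline shares before the same subtraction.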

Suggested Citation

  • Krist Vaesen & Joel Katzav, 2017. "How much would each researcher receive if competitive government research funding were distributed equally among researchers?," PLOS ONE, Public Library of Science, vol. 12(9), pages 1-11, September.
  • Handle: RePEc:plo:pone00:0183967
    DOI: 10.1371/journal.pone.0183967

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0183967
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0183967&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0183967?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Jörg Neufeld, 2016. "Determining effects of individual research grants on publication output and impact: The case of the Emmy Noether Programme (German Research Foundation)," Research Evaluation, Oxford University Press, vol. 25(1), pages 50-61.
    2. John P. A. Ioannidis, 2011. "Fund people not projects," Nature, Nature, vol. 477(7366), pages 529-531, September.
    3. Daniel Mietchen, 2014. "The Transformative Nature of Transparency in Research Funding," PLOS Biology, Public Library of Science, vol. 12(12), pages 1-3, December.
    4. Bornmann, Lutz & Leydesdorff, Loet & Van den Besselaar, Peter, 2010. "A meta-evaluation of scientific research proposals: Different ways of comparing rejected to awarded applications," Journal of Informetrics, Elsevier, vol. 4(3), pages 211-220.
    5. Benda, Wim G.G. & Engels, Tim C.E., 2011. "The predictive validity of peer review: A selective review of the judgmental forecasting qualities of peers, and implications for innovation in science," International Journal of Forecasting, Elsevier, vol. 27(1), pages 166-182, January.
    6. Heidi Ledford, 2016. "US law could increase postdoc pay — and shake up research system," Nature, Nature, vol. 533(7604), pages 450-450, May.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Kyle R. Myers, 2022. "Some Tradeoffs of Competition in Grant Contests," Papers 2207.02379, arXiv.org, revised Mar 2024.
    2. Cian O’Donovan & Aleksandra (Ola) Michalec & Joshua R Moon, 2022. "Capabilities for transdisciplinary research," Research Evaluation, Oxford University Press, vol. 31(1), pages 145-158.
    3. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    4. Peixin Duan, 2022. "How large of a grant size is appropriate? Evidence from the National Natural Science Foundation of China," PLOS ONE, Public Library of Science, vol. 17(2), pages 1-14, February.
    5. Stephan Puehringer, 2023. "Wie viel Wettbewerb wollen wir (uns leisten)? Zur Verwettbewerblichung der Universitäten in Österreich und darüber hinaus" [How much competition do we want (to afford)? On the marketization of universities in Austria and beyond], ICAE Working Papers 149, Johannes Kepler University, Institute for Comprehensive Analysis of the Economy.
    6. Ádám Kun, 2018. "Publish and Who Should Perish: You or Science?," Publications, MDPI, vol. 6(2), pages 1-16, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. van den Besselaar, Peter & Sandström, Ulf, 2015. "Early career grants, performance, and careers: A study on predictive validity of grant decisions," Journal of Informetrics, Elsevier, vol. 9(4), pages 826-838.
    2. Frederik T. Verleysen & Tim C.E. Engels, 2013. "A label for peer-reviewed books," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 428-430, February.
    3. Girish Mallapragada & Nandini Lahiri & Atul Nerkar, 2016. "Peer Review and Research Impact," Customer Needs and Solutions, Springer;Institute for Sustainable Innovation and Growth (iSIG), vol. 3(1), pages 29-41, March.
    4. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    5. Wright, George & Rowe, Gene, 2011. "Group-based judgmental forecasting: An integration of extant knowledge and the development of priorities for a new research agenda," International Journal of Forecasting, Elsevier, vol. 27(1), pages 1-13, January.
    6. Jens Jirschitzka & Aileen Oeberst & Richard Göllner & Ulrike Cress, 2017. "Inter-rater reliability and validity of peer reviews in an interdisciplinary field," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1059-1092, November.
    7. Lutz Bornmann, 2012. "The Hawthorne effect in journal peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 857-862, June.
    8. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    9. Richard R Snell, 2015. "Menage a Quoi? Optimal Number of Peer Reviewers," PLOS ONE, Public Library of Science, vol. 10(4), pages 1-14, April.
    10. Gill, Chelsea & Mehrotra, Vishal & Moses, Olayinka & Bui, Binh, 2023. "The impact of the pitching research framework on AFAANZ grant applications," Pacific-Basin Finance Journal, Elsevier, vol. 77(C).
    11. Belén Álvarez-Bornstein & Adrián A. Díaz-Faes & María Bordons, 2019. "What characterises funded biomedical research? Evidence from a basic and a clinical domain," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 805-825, May.
    12. Wu, Jiang & Ou, Guiyan & Liu, Xiaohui & Dong, Ke, 2022. "How does academic education background affect top researchers’ performance? Evidence from the field of artificial intelligence," Journal of Informetrics, Elsevier, vol. 16(2).
    13. Mutz, Rüdiger & Daniel, Hans-Dieter, 2018. "The bibliometric quotient (BQ), or how to measure a researcher’s performance capacity: A Bayesian Poisson Rasch model," Journal of Informetrics, Elsevier, vol. 12(4), pages 1282-1295.
    14. Cinzia Daraio & Simone Di Leo & Loet Leydesdorff, 2022. "Using the Leiden Rankings as a Heuristics: Evidence from Italian universities in the European landscape," LEM Papers Series 2022/08, Laboratory of Economics and Management (LEM), Sant'Anna School of Advanced Studies, Pisa, Italy.
    15. Llopis, Oscar & D'Este, Pablo & McKelvey, Maureen & Yegros, Alfredo, 2022. "Navigating multiple logics: Legitimacy and the quest for societal impact in science," Technovation, Elsevier, vol. 110(C).
    16. Frank Rijnsoever & Leon Welle & Sjoerd Bakker, 2014. "Credibility and legitimacy in policy-driven innovation networks: resource dependencies and expectations in Dutch electric vehicle subsidies," The Journal of Technology Transfer, Springer, vol. 39(4), pages 635-661, August.
    17. Janne Pölönen & Otto Auranen, 2022. "Research performance and scholarly communication profile of competitive research funding: the case of Academy of Finland," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7415-7433, December.
    18. Daniel Mietchen, 2014. "The Transformative Nature of Transparency in Research Funding," PLOS Biology, Public Library of Science, vol. 12(12), pages 1-3, December.
    19. Kiri, Bralind & Lacetera, Nicola & Zirulia, Lorenzo, 2018. "Above a swamp: A theory of high-quality scientific production," Research Policy, Elsevier, vol. 47(5), pages 827-839.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0183967. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.