IDEAS home Printed from https://ideas.repec.org/a/spr/scient/v110y2017i1d10.1007_s11192-016-2110-3.html

An efficient system to fund science: from proposal review to peer-to-peer distributions

Author

Listed:
  • Johan Bollen

    (Indiana University)

  • David Crandall

    (Indiana University)

  • Damion Junk

    (Indiana University)

  • Ying Ding

    (Indiana University)

  • Katy Börner

    (Indiana University)

Abstract

This paper presents a novel model of science funding that exploits the wisdom of the scientific crowd. Each researcher receives an equal, unconditional part of all available science funding on a yearly basis, but is required to individually donate a given fraction of all they receive to other scientists. Science funding thus moves from one scientist to the next in such a way that scientists who receive many donations must also redistribute the most. As the funding circulates through the scientific community, it is mathematically expected to converge on a funding distribution favored by the entire scientific community. This is achieved without any proposal submissions or reviews. The model furthermore funds scientists instead of projects, reducing much of the overhead and bias of the present grant peer review system. Model validation using large-scale citation data and funding records over the past 20 years shows that the proposed model could yield funding distributions similar to those of the NSF and NIH, and that it could potentially be fairer and more equitable. We discuss possible extensions of this approach as well as science policy implications.
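The circulation dynamic described in the abstract can be sketched as a toy simulation. All parameter values, the population size, and the random donation-preference matrix below are illustrative assumptions, not taken from the paper; the sketch only demonstrates the convergence behavior the abstract claims:

```python
import numpy as np

rng = np.random.default_rng(42)

N = 50            # scientists in the toy population (illustrative)
BASE = 100_000.0  # equal, unconditional yearly base amount per scientist
F = 0.5           # fraction of all received funding that must be donated onward
YEARS = 200       # iterate until the circulation settles

# Hypothetical donation preferences: W[i, j] is the share of scientist i's
# mandatory donations that goes to scientist j (no self-donations).
W = rng.random((N, N))
np.fill_diagonal(W, 0.0)
W /= W.sum(axis=1, keepdims=True)  # each row sums to 1

funding = np.full(N, BASE)
for _ in range(YEARS):
    # Each scientist receives the base amount plus a share of the
    # fraction F that every other scientist redistributes.
    funding = np.full(N, BASE) + F * (funding @ W)

retained = (1 - F) * funding  # what each scientist keeps after donating
```

Because F < 1 and W is row-stochastic, the iteration converges geometrically: the total retained funding stays equal to the base budget N × BASE, while its distribution comes to reflect the aggregated donation preferences of the community rather than any proposal review.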

Suggested Citation

  • Johan Bollen & David Crandall & Damion Junk & Ying Ding & Katy Börner, 2017. "An efficient system to fund science: from proposal review to peer-to-peer distributions," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 521-528, January.
  • Handle: RePEc:spr:scient:v:110:y:2017:i:1:d:10.1007_s11192-016-2110-3
    DOI: 10.1007/s11192-016-2110-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-016-2110-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-016-2110-3?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. M.H. MacRoberts & B.R. MacRoberts, 2010. "Problems of citation analysis: A study of uncited and seldom-cited influences," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(1), pages 1-12, January.
    2. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    3. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2012. "NIH Peer Review: Challenges and Avenues for Reform," NBER Working Papers 18116, National Bureau of Economic Research, Inc.
    4. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2007. "Gender differences in grant peer review: A meta-analysis," Journal of Informetrics, Elsevier, vol. 1(3), pages 226-238.
    5. Subra Suresh, 2012. "Global challenges need global solutions," Nature, Nature, vol. 490(7420), pages 337-338, October.
    6. Trisha Gura, 2002. "Peer review, unmasked," Nature, Nature, vol. 416(6878), pages 258-260, March.
    7. Natasha Gilbert, 2009. "Wellcome Trust makes it personal in funding revamp," Nature, Nature, vol. 462(7270), pages 145-145, November.
    Full references (including those not matched with items on IDEAS)

    Citations

    Blog mentions

    As found by EconAcademics.org, the blog aggregator for Economics research:
    1. What if scientists funded each other?
      by ? in Retraction Watch on 2016-09-20 13:54:00
    2. The Cost of Writing a Project
      by ? in Much Bigger Outside on 2017-06-04 06:00:20
    3. Facilitating peer review with cognitive computing
      by ? in IBM Research on 2017-05-16 12:00:00

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Huang, Ding-wei, 2021. "A basic model for empirical funding distributions," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 566(C).
    2. van Dalen, Hendrik Peter, 2020. "How the Publish-or-Perish Principle Divides a Science: The Case of Academic Economists," Discussion Paper 2020-020, Tilburg University, Center for Economic Research.
    3. Alessandro Pluchino & Alessio Emanuele Biondo & Andrea Rapisarda, 2018. "Talent Versus Luck: The Role Of Randomness In Success And Failure," Advances in Complex Systems (ACS), World Scientific Publishing Co. Pte. Ltd., vol. 21(03n04), pages 1-31, May.
    4. Balázs Győrffy & Andrea Magda Nagy & Péter Herman & Ádám Török, 2018. "Factors influencing the scientific performance of Momentum grant holders: an evaluation of the first 117 research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 409-426, October.
    5. Huang, Ding-wei, 2018. "Optimal distribution of science funding," Physica A: Statistical Mechanics and its Applications, Elsevier, vol. 502(C), pages 613-618.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sten F Odenwald, 2020. "A citation study of earth science projects in citizen science," PLOS ONE, Public Library of Science, vol. 15(7), pages 1-26, July.
    2. Martin Ricker, 2017. "Letter to the Editor: About the quality and impact of scientific articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(3), pages 1851-1855, June.
    3. Yang, Siluo & Han, Ruizhen & Wolfram, Dietmar & Zhao, Yuehua, 2016. "Visualizing the intellectual structure of information science (2006–2015): Introducing author keyword coupling analysis," Journal of Informetrics, Elsevier, vol. 10(1), pages 132-150.
    4. Horbach, Serge & Aagaard, Kaare & Schneider, Jesper W., 2021. "Meta-Research: How problematic citing practices distort science," MetaArXiv aqyhg, Center for Open Science.
    5. Siluo Yang & Feng Ma & Yanhui Song & Junping Qiu, 2010. "A longitudinal analysis of citation distribution breadth for Chinese scholars," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(3), pages 755-765, December.
    6. Hu, Zewen & Wu, Yishan, 2014. "Regularity in the time-dependent distribution of the percentage of never-cited papers: An empirical pilot study based on the six journals," Journal of Informetrics, Elsevier, vol. 8(1), pages 136-146.
    7. Zewen Hu & Yishan Wu & Jianjun Sun, 2018. "A quantitative analysis of determinants of non-citation using a panel data model," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 843-861, August.
    8. Lawrence Smolinsky & Aaron Lercher, 2012. "Citation rates in mathematics: a study of variation by subdiscipline," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(3), pages 911-924, June.
    9. Tobias Opthof & Loet Leydesdorff, 2011. "A comment to the paper by Waltman et al., Scientometrics, 87, 467–481, 2011," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(3), pages 1011-1016, September.
    10. Lutz Bornmann & Robin Haunschild, 2017. "Does evaluative scientometrics lose its main focus on scientific quality by the new orientation towards societal impact?," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(2), pages 937-943, February.
    11. Park, Hyunwoo & Lee, Jeongsik (Jay) & Kim, Byung-Cheol, 2015. "Project selection in NIH: A natural experiment from ARRA," Research Policy, Elsevier, vol. 44(6), pages 1145-1159.
    12. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2021. "Does It Pay to Do Novel Science? The Selectivity Patterns in Science Funding," Science and Public Policy, Oxford University Press, vol. 48(5), pages 635-648.
    13. Andrey Lovakov & Elena Agadullina, 2019. "Bibliometric analysis of publications from post-Soviet countries in psychological journals in 1992–2017," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1157-1171, May.
    14. Jianhua Hou & Jiantao Ye, 2020. "Are uncited papers necessarily all nonimpact papers? A quantitative analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1631-1662, August.
    15. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    16. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    17. Thelwall, Mike, 2017. "Three practical field normalised alternative indicator formulae for research evaluation," Journal of Informetrics, Elsevier, vol. 11(1), pages 128-151.
    18. Mike Thelwall, 2019. "The influence of highly cited papers on field normalised indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 519-537, February.
    19. Ehsan Mohammadi & Mike Thelwall, 2013. "Assessing non-standard article impact using F1000 labels," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 383-395, November.
    20. Michael Calver & Kate Bryant & Grant Wardell-Johnson, 2018. "Quantifying the internationality and multidisciplinarity of authors and journals using ecological statistics," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 731-748, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:110:y:2017:i:1:d:10.1007_s11192-016-2110-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.