
Research funding randomly allocated? A survey of scientists’ views on peer review and lottery

Author

  • Axel Philipps

Abstract

The bold idea of random grant allocation is hotly debated as an alternative to peer review. The debate centers on the advantages and disadvantages of established measures of scientific quality control compared with funding by chance. Recent studies have also investigated the acceptance of lotteries in the scientific field, but their restricted scope yields only inconclusive findings. This paper examines scientists’ views on current funding conditions and on the idea of random grant distribution. An online survey of PhD holders reveals that most participants oppose pure randomness, although they would try random elements if such procedures were combined with peer review. Moreover, while less established and recognized scientists differ in their assessments of peer review and in their expectations of lotteries’ impact, they hardly vary in their positions on random elements. Funding organizations should therefore be encouraged to further experiment with, and closely examine, practiced lotteries.

Suggested Citation

  • Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.
  • Handle: RePEc:oup:scippl:v:49:y:2022:i:3:p:365-377.

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1093/scipol/scab084
    Download Restriction: Access to full text is restricted to subscribers.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Axel Philipps, 2021. "Science rules! A qualitative study of scientists’ approaches to grant lottery [The Secret to Germany’s Scientific Excellence]," Research Evaluation, Oxford University Press, vol. 30(1), pages 102-111.
    2. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    3. David Adam, 2019. "Science funders gamble on grant lotteries," Nature, Nature, vol. 575(7784), pages 574-575, November.
    4. Heinze, Thomas & Shapira, Philip & Rogers, Juan D. & Senker, Jacqueline M., 2009. "Organizational and institutional influences on creativity in scientific research," Research Policy, Elsevier, vol. 38(4), pages 610-623, May.
    5. Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
    6. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2021. "Does It Pay to Do Novel Science? The Selectivity Patterns in Science Funding," Science and Public Policy, Oxford University Press, vol. 48(5), pages 635-648.
    7. Elise S Brezis, 2007. "Focal randomisation: An optimal mechanism for the evaluation of R&D projects," Science and Public Policy, Oxford University Press, vol. 34(10), pages 691-698, December.
    8. John P. A. Ioannidis, 2011. "Fund people not projects," Nature, Nature, vol. 477(7366), pages 529-531, September.
    9. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    10. Musselin, Christine, 2013. "How peer review empowers the academic profession and university managers: Changes in relationships between the state, universities and the professoriate," Research Policy, Elsevier, vol. 42(5), pages 1165-1173.
    11. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Selection of research fellowship recipients by committee peer review. Reliability, fairness and predictive validity of Board of Trustees' decisions," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 297-320, April.
    12. Osterloh, Margit & Frey, Bruno S., 2019. "Dealing With Randomness," management revue - Socio-Economic Studies, Nomos Verlagsgesellschaft mbH & Co. KG, vol. 30(4), pages 331-345.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lawson, Cornelia & Salter, Ammon, 2023. "Exploring the effect of overlapping institutional applications on panel decision-making," Research Policy, Elsevier, vol. 52(9).
    2. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2021. "Does It Pay to Do Novel Science? The Selectivity Patterns in Science Funding," Science and Public Policy, Oxford University Press, vol. 48(5), pages 635-648.
    3. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    4. Elise S. Brezis & Aliaksandr Birukou, 2020. "Arbitrariness in the peer review process," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 393-411, April.
    5. Gregoire Mariethoz & Frédéric Herman & Amelie Dreiss, 2021. "The imaginary carrot: no correlation between raising funds and research productivity in geosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(3), pages 2401-2407, March.
    6. Albert Banal-Estañol & Qianshuo Liu & Inés Macho-Stadler & David Pérez-Castrillo, 2021. "Similar-to-me Effects in the Grant Application Process: Applicants, Panelists, and the Likelihood of Obtaining Funds," Working Papers 1289, Barcelona School of Economics.
    7. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    8. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Working Papers 26889, National Bureau of Economic Research, Inc.
    9. Chiara Franzoni & Paula Stephan & Reinhilde Veugelers, 2022. "Funding Risky Research," Entrepreneurship and Innovation Policy and the Economy, University of Chicago Press, vol. 1(1), pages 103-133.
    10. Nicolas Carayol, 2016. "The Right Job and the Job Right: Novelty, Impact and Journal Stratification in Science," Post-Print hal-02274661, HAL.
    11. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    12. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    13. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    14. Marco Ottaviani, 2020. "Grantmaking," Working Papers 672, IGIER (Innocenzo Gasparini Institute for Economic Research), Bocconi University.
    15. Balietti, Stefano & Riedl, Christoph, 2021. "Incentives, competition, and inequality in markets for creative production," Research Policy, Elsevier, vol. 50(4).
    16. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    17. Stephen Gallo & Lisa Thompson & Karen Schmaling & Scott Glisson, 2018. "Risk evaluation in peer review of grant applications," Environment Systems and Decisions, Springer, vol. 38(2), pages 216-229, June.
    18. Pierre Azoulay & Danielle Li, 2020. "Scientific Grant Funding," NBER Chapters, in: Innovation and Public Policy, pages 117-150, National Bureau of Economic Research, Inc.
    19. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    20. Miguel Navascués & Costantino Budroni, 2019. "Theoretical research without projects," PLOS ONE, Public Library of Science, vol. 14(3), pages 1-35, March.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.