Printed from https://ideas.repec.org/a/spr/scient/v129y2024i5d10.1007_s11192-024-04968-7.html

Do grant proposal texts matter for funding decisions? A field experiment

Author

Listed:
  • Müge Simsek

    (University of Amsterdam)

  • Mathijs de Vaan

    (University of California, Berkeley)

  • Arnout van de Rijt

    (European University Institute
    Utrecht University)

Abstract

Scientists and funding agencies invest considerable resources in writing and evaluating grant proposals. But do grant proposal texts noticeably change panel decisions in single-blind review? We report on a field experiment conducted by the Dutch Research Council (NWO), in collaboration with the authors, in an early-career competition for awards of 800,000 euros of research funding. A random half of panelists were shown a CV and only a one-paragraph summary of the proposed research, while the other half were shown a CV and a full proposal. We find that withholding proposal texts from panelists did not detectably affect their proposal rankings. This result suggests that the resources devoted to writing and evaluating grant proposals may not have their intended effect of facilitating the selection of the most promising science.

Suggested Citation

  • Müge Simsek & Mathijs de Vaan & Arnout van de Rijt, 2024. "Do grant proposal texts matter for funding decisions? A field experiment," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(5), pages 2521-2532, May.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:5:d:10.1007_s11192-024-04968-7
    DOI: 10.1007/s11192-024-04968-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-024-04968-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-024-04968-7?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Johan Bollen & Herbert Van de Sompel & Aric Hagberg & Ryan Chute, 2009. "A Principal Component Analysis of 39 Scientific Impact Measures," PLOS ONE, Public Library of Science, vol. 4(6), pages 1-11, June.
    2. Jacob, Brian A. & Lefgren, Lars, 2011. "The impact of research grant funding on scientific productivity," Journal of Public Economics, Elsevier, vol. 95(9), pages 1168-1177.
    3. David M. Waguespack & Olav Sorenson, 2011. "The Ratings Game: Asymmetry in Classification," Organization Science, INFORMS, vol. 22(3), pages 541-553, June.
    4. John P. A. Ioannidis, 2011. "Fund people not projects," Nature, Nature, vol. 477(7366), pages 529-531, September.
    5. Upali W. Jayasinghe & Herbert W. Marsh & Nigel Bond, 2003. "A multilevel cross‐classified modelling approach to peer review of grant proposals: the effects of assessor and researcher attributes on assessor ratings," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 166(3), pages 279-300, October.
    6. Bornmann, Lutz & Leydesdorff, Loet, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.
    7. Kevin Gross & Carl T Bergstrom, 2019. "Contest models highlight inherent inefficiencies of scientific funding competitions," PLOS Biology, Public Library of Science, vol. 17(1), pages 1-15, January.
    8. David Adam, 2019. "Science funders gamble on grant lotteries," Nature, Nature, vol. 575(7784), pages 574-575, November.
    9. Yang Wang & Benjamin F. Jones & Dashun Wang, 2019. "Early-career setback and future career impact," Nature Communications, Nature, vol. 10(1), pages 1-10, December.
    10. Rüdiger Mutz & Lutz Bornmann & Hans-Dieter Daniel, 2012. "Heterogeneity of Inter-Rater Reliabilities of Grant Peer Reviews and Its Determinants: A General Estimating Equations Approach," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-10, October.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Klebel, Thomas & Traag, Vincent, 2025. "Introduction to structural causal models in science studies," SocArXiv 4bw9e_v3, Center for Open Science.

    Most related items

    These are the items that most often cite the same works as this one, and that are cited by the same works that cite this one.
    1. Gregoire Mariethoz & Frédéric Herman & Amelie Dreiss, 2021. "The imaginary carrot: no correlation between raising funds and research productivity in geosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(3), pages 2401-2407, March.
    2. Kyle R. Myers, 2022. "Some Tradeoffs of Competition in Grant Contests," Papers 2207.02379, arXiv.org, revised Mar 2024.
    3. Axel Philipps, 2022. "Research funding randomly allocated? A survey of scientists’ views on peer review and lottery," Science and Public Policy, Oxford University Press, vol. 49(3), pages 365-377.
    4. Li, Meiling & Wang, Yang & Du, Haifeng & Bai, Aruhan, 2024. "Motivating innovation: The impact of prestigious talent funding on junior scientists," Research Policy, Elsevier, vol. 53(9).
    5. Elena A. Erosheva & Patrícia Martinková & Carole J. Lee, 2021. "When zero may not be zero: A cautionary note on the use of inter‐rater reliability in evaluating grant peer review," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(3), pages 904-919, July.
    6. Lawson, Cornelia & Salter, Ammon, 2023. "Exploring the effect of overlapping institutional applications on panel decision-making," Research Policy, Elsevier, vol. 52(9).
    7. B Ian Hutchins & Xin Yuan & James M Anderson & George M Santangelo, 2016. "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-25, September.
    8. Kok, Holmer & Faems, Dries & de Faria, Pedro, 2022. "Pork Barrel or Barrel of Gold? Examining the performance implications of earmarking in public R&D grants," Research Policy, Elsevier, vol. 51(7).
    9. Marielle Non & Jeroen van Honk & Vince van Houten & Inge van der Weijden & Thed van Leeuwen, 2022. "Getting off to a flying start? The effects of an early-career international mobility grant on scientific performance," CPB Discussion Paper 443, CPB Netherlands Bureau for Economic Policy Analysis.
    10. Ayoubi, Charles & Pezzoni, Michele & Visentin, Fabiana, 2019. "The important thing is not to win, it is to take part: What if scientists benefit from participating in research grant competitions?," Research Policy, Elsevier, vol. 48(1), pages 84-97.
    11. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2017. "The Important Thing is not to Win, it is to Take Part: What If Scientists Benefit from Participating in Competitive Grant Races?," GREDEG Working Papers 2017-27, Groupe de REcherche en Droit, Economie, Gestion (GREDEG CNRS), Université Côte d'Azur, France.
    12. Zohreh Zahedi & Rodrigo Costas & Paul Wouters, 2014. "How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1491-1513, November.
    13. Miguel Navascués & Costantino Budroni, 2019. "Theoretical research without projects," PLOS ONE, Public Library of Science, vol. 14(3), pages 1-35, March.
    14. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    15. David Gurwitz & Elena Milanesi & Thomas Koenig, 2014. "Grant Application Review: The Case of Transparency," PLOS Biology, Public Library of Science, vol. 12(12), pages 1-6, December.
    16. Ghirelli, Corinna & Havari, Enkelejda & Meroni, Elena Claudia & Verzillo, Stefano, 2023. "The Long-Term Causal Effects of Winning an ERC Grant," IZA Discussion Papers 16108, Institute of Labor Economics (IZA).
    17. Mutz, Rüdiger & Daniel, Hans-Dieter, 2018. "The bibliometric quotient (BQ), or how to measure a researcher’s performance capacity: A Bayesian Poisson Rasch model," Journal of Informetrics, Elsevier, vol. 12(4), pages 1282-1295.
    18. Patrícia Martinková & Dan Goldhaber & Elena Erosheva, 2018. "Disparities in ratings of internal and external applicants: A case for model-based inter-rater reliability," PLOS ONE, Public Library of Science, vol. 13(10), pages 1-17, October.
    19. David G Pina & Darko Hren & Ana Marušić, 2015. "Peer Review Evaluation Process of Marie Curie Actions under EU’s Seventh Framework Programme for Research," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    20. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:129:y:2024:i:5:d:10.1007_s11192-024-04968-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.