
Unveiling the impact and dual innovation of funded research

Author

Listed:
  • Yang, Alex J.

Abstract

In the pursuit of scientific advancement, understanding the impact and innovative character of funded research is of central importance. To examine this question, I study research supported by the National Institutes of Health (NIH) and the National Science Foundation (NSF). The evaluation framework draws on a range of metrics, including citations from papers, patents, and tweets, as markers of research impact. In addition, I use ex-ante innovation (Novelty) and ex-post innovation (Disruption) as two complementary yardsticks of a project's innovative character. Novelty captures atypical combinations of existing knowledge, while Disruption captures the extent to which a paper shifts paradigms and exerts a disruptive influence on future research. First, the analysis shows that funded research projects have a markedly higher impact than their non-funded counterparts. Second, funded research exhibits significantly higher ex-ante innovation (Novelty); surprisingly, however, the effect of funding on ex-post innovation (Disruption) appears to be faint. Finally, I assess the robustness of these findings by examining patterns across years and fields. Despite the uneven distribution of NIH- and NSF-funded research and modest heterogeneity across fields, the patterns of impact and dual innovation of funded research hold in almost all fields.
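The Disruption measure summarized above is commonly operationalized as the CD (disruption) index introduced by Funk & Owen-Smith (2017), which appears in the reference list below, while Novelty is commonly measured through atypical combinations of cited references. The sketch below illustrates the disruption-index calculation only; it is not the paper's own code, and the function name and toy citation data are hypothetical.

# Minimal sketch of the CD / disruption index of Funk & Owen-Smith (2017),
# a common operationalization of "ex-post innovation (Disruption)".
# Illustrative only; the toy data below are hypothetical.

def disruption_index(focal, references, citers_of):
    """Compute D = (n_i - n_j) / (n_i + n_j + n_k) for a focal paper.

    focal      : identifier of the focal paper
    references : set of papers the focal paper cites
    citers_of  : dict mapping each paper id to the set of later papers citing it

    Among the later papers:
      n_i cite the focal paper but none of its references (disrupting),
      n_j cite both the focal paper and at least one of its references (consolidating),
      n_k cite at least one reference but not the focal paper.
    """
    cites_focal = citers_of.get(focal, set())
    cites_refs = set()
    for ref in references:
        cites_refs |= citers_of.get(ref, set())

    n_i = len(cites_focal - cites_refs)
    n_j = len(cites_focal & cites_refs)
    n_k = len(cites_refs - cites_focal)
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0

# Toy example: focal paper "F" cites "R1" and "R2"; A-D are later papers.
citers = {
    "F":  {"A", "B", "C"},   # later papers citing the focal paper
    "R1": {"B", "D"},        # later papers citing reference R1
    "R2": {"D"},             # later papers citing reference R2
}
print(disruption_index("F", {"R1", "R2"}, citers))  # (2 - 1) / (2 + 1 + 1) = 0.25

The index ranges from -1 (purely consolidating) to +1 (purely disrupting).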

Suggested Citation

  • Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
  • Handle: RePEc:eee:infome:v:18:y:2024:i:1:s1751157723001050
    DOI: 10.1016/j.joi.2023.101480

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157723001050
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2023.101480?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Joshua D. Angrist & Jörn-Steffen Pischke, 2010. "The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics," Journal of Economic Perspectives, American Economic Association, vol. 24(2), pages 3-30, Spring.
    2. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    3. Jian Gao & Yi-Cheng Zhang & Tao Zhou, 2019. "Computational Socioeconomics," Papers 1905.06166, arXiv.org.
    4. Park, Hyunwoo & Lee, Jeongsik (Jay) & Kim, Byung-Cheol, 2015. "Project selection in NIH: A natural experiment from ARRA," Research Policy, Elsevier, vol. 44(6), pages 1145-1159.
    5. James G. March, 1991. "Exploration and Exploitation in Organizational Learning," Organization Science, INFORMS, vol. 2(1), pages 71-87, February.
    6. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    7. Ekaterina Galkina Cleary & Jennifer M. Beierlein & Navleen Surjit Khanuja & Laura M. McNamee & Fred D. Ledley, 2018. "Contribution of NIH funding to new drug approvals 2010–2016," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 115(10), pages 2329-2334, March.
    8. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    9. Mikko Packalen & Jay Bhattacharya, 2020. "NIH funding and the pursuit of edge science," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 117(22), pages 12011-12016, June.
    10. Pierre Azoulay & Joshua S Graff Zivin & Danielle Li & Bhaven N Sampat, 2019. "Public R&D Investments and Private-sector Patenting: Evidence from NIH Funding Rules," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 86(1), pages 117-152.
    11. Michael Park & Erin Leahey & Russell J. Funk, 2023. "Papers and patents are becoming less disruptive over time," Nature, Nature, vol. 613(7942), pages 138-144, January.
    12. Pierre Azoulay, 2012. "Turn the scientific method on ourselves," Nature, Nature, vol. 484(7392), pages 31-32, April.
    13. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).
    14. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    15. Loet Leydesdorff & Lutz Bornmann & Caroline S. Wagner, 2019. "The Relative Influences of Government Funding and International Collaboration on Citation Impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 70(2), pages 198-201, February.
    16. Loet Leydesdorff, 2018. "Diversity and interdisciplinarity: how can one distinguish and recombine disparity, variety, and balance?," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2113-2121, September.
    17. Yian Yin & Yang Wang & James A. Evans & Dashun Wang, 2019. "Quantifying the dynamics of failure across science, startups and security," Nature, Nature, vol. 575(7781), pages 190-194, November.
    18. Trapido, Denis, 2015. "How novelty in knowledge earns recognition: The role of consistent identities," Research Policy, Elsevier, vol. 44(8), pages 1488-1500.
    19. Feliciani, Thomas & Morreau, Michael & Luo, Junwen & Lucas, Pablo & Shankar, Kalpana, 2022. "Designing grant-review panels for better funding decisions: Lessons from an empirically calibrated simulation model," Research Policy, Elsevier, vol. 51(4).
    20. Wagner, Caroline S. & Whetsell, Travis A. & Mukherjee, Satyam, 2019. "International research collaboration: Novelty, conventionality, and atypicality in knowledge recombination," Research Policy, Elsevier, vol. 48(5), pages 1260-1270.
    21. Andy Stirling, 2007. "A General Framework for Analysing Diversity in Science, Technology and Society," SPRU Working Paper Series 156, SPRU - Science Policy Research Unit, University of Sussex Business School.
    22. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    23. Feng Shi & James Evans, 2023. "Surprising combinations of research contents and contexts are related to impact and emerge with scientific outsiders from distant disciplines," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    24. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    25. Matt Marx & Aaron Fuegi, 2020. "Reliance on science: Worldwide front‐page patent citations to scientific articles," Strategic Management Journal, Wiley Blackwell, vol. 41(9), pages 1572-1594, September.
    26. Danielle Li, 2017. "Expertise versus Bias in Evaluation: Evidence from the NIH," American Economic Journal: Applied Economics, American Economic Association, vol. 9(2), pages 60-92, April.
    27. Paula Stephan & Reinhilde Veugelers & Jian Wang, 2017. "Reviewers are blinkered by bibliometrics," Nature, Nature, vol. 544(7651), pages 411-412, April.
    28. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    29. Johan S. G. Chu & James A. Evans, 2021. "Slowed canonical progress in large fields of science," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 118(41), pages 2021636118-, October.
    30. Pierre Azoulay & Erica Fuchs & Anna P. Goldstein & Michael Kearney, 2018. "Funding Breakthrough Research: Promises and Challenges of the "ARPA Model"," NBER Chapters, in: Innovation Policy and the Economy, Volume 19, pages 69-96, National Bureau of Economic Research, Inc.
    31. Matt Marx & Aaron Fuegi, 2022. "Reliance on science by inventors: Hybrid extraction of in‐text patent‐to‐article citations," Journal of Economics & Management Strategy, Wiley Blackwell, vol. 31(2), pages 369-392, April.
    32. Shor, Boris & Bafumi, Joseph & Keele, Luke & Park, David, 2007. "A Bayesian Multilevel Modeling Approach to Time-Series Cross-Sectional Data," Political Analysis, Cambridge University Press, vol. 15(2), pages 165-181, April.
    33. Fengli Xu & Lingfei Wu & James Evans, 2022. "Flat teams drive scientific innovation," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 119(23), pages 2200927119-, June.
    34. Fontana, Magda & Iori, Martina & Montobbio, Fabio & Sinatra, Roberta, 2020. "New and atypical combinations: An assessment of novelty and interdisciplinarity," Research Policy, Elsevier, vol. 49(7).
    35. Bas Hofstra & Vivek V. Kulkarni & Sebastian Munoz-Najar Galvez & Bryan He & Dan Jurafsky & Daniel A. McFarland, 2020. "The Diversity–Innovation Paradox in Science," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 117(17), pages 9284-9291, April.
    36. Leydesdorff, Loet & Wagner, Caroline S. & Bornmann, Lutz, 2019. "Interdisciplinarity as diversity in citation patterns among journals: Rao-Stirling diversity, relative variety, and the Gini coefficient," Journal of Informetrics, Elsevier, vol. 13(1), pages 255-269.
    37. Jian Wang, 2013. "Citation time window choice for research impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 851-872, March.
    38. Pierre Azoulay & Erica Fuchs & Anna Goldstein & Michael Kearney, 2018. "Funding Breakthrough Research: Promises and Challenges of the “ARPA Model”," NBER Working Papers 24674, National Bureau of Economic Research, Inc.
    39. Chen, Jiyao & Shao, Diana & Fan, Shaokun, 2021. "Destabilization and consolidation: Conceptualizing, measuring, and validating the dual characteristics of technology," Research Policy, Elsevier, vol. 50(1).
    40. Yian Yin & Yuxiao Dong & Kuansan Wang & Dashun Wang & Benjamin F. Jones, 2022. "Public use and public funding of science," Nature Human Behaviour, Nature, vol. 6(10), pages 1344-1350, October.
    41. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    2. Yang, Alex Jie & Wu, Linwei & Zhang, Qi & Wang, Hao & Deng, Sanhong, 2023. "The k-step h-index in citation networks at the paper, author, and institution levels," Journal of Informetrics, Elsevier, vol. 17(4).
    3. Kwon, Seokbeom, 2022. "Interdisciplinary knowledge integration as a unique knowledge source for technology development and the role of funding allocation," Technological Forecasting and Social Change, Elsevier, vol. 181(C).
    4. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Nov 2023.
    5. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    6. Zhang, Yang & Wang, Yang & Du, Haifeng & Havlin, Shlomo, 2024. "Delayed citation impact of interdisciplinary research," Journal of Informetrics, Elsevier, vol. 18(1).
    7. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    8. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).
    9. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    10. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    11. Shiji Chen & Yanhui Song & Fei Shu & Vincent Larivière, 2022. "Interdisciplinarity and impact: the effects of the citation time window," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2621-2642, May.
    12. Wang, Cheng-Jun & Yan, Lihan & Cui, Haochuan, 2023. "Unpacking the essential tension of knowledge recombination: Analyzing the impact of knowledge spanning on citation impact and disruptive innovation," Journal of Informetrics, Elsevier, vol. 17(4).
    13. Xiaojing Cai & Xiaozan Lyu & Ping Zhou, 2023. "The relationship between interdisciplinarity and citation impact—a novel perspective on citation accumulation," Palgrave Communications, Palgrave Macmillan, vol. 10(1), pages 1-12, December.
    14. Sotaro Shibayama & Deyun Yin & Kuniko Matsumoto, 2021. "Measuring novelty in science with word embedding," PLOS ONE, Public Library of Science, vol. 16(7), pages 1-16, July.
    15. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    16. Zhongyi Wang & Keying Wang & Jiyue Liu & Jing Huang & Haihua Chen, 2022. "Measuring the innovation of method knowledge elements in scientific literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2803-2827, May.
    17. Seolmin Yang & So Young Kim, 2023. "Knowledge-integrated research is more disruptive when supported by homogeneous funding sources: a case of US federally funded research in biomedical and life sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3257-3282, June.
    18. Elizabeth S. Vieira, 2023. "The influence of research collaboration on citation impact: the countries in the European Innovation Scoreboard," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3555-3579, June.
    19. Kyle Myers & Wei Yang Tham, 2023. "Money, Time, and Grant Design," Papers 2312.06479, arXiv.org.
    20. António Osório & Lutz Bornmann, 2021. "On the disruptive power of small-teams research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 117-133, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:18:y:2024:i:1:s1751157723001050. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.