Printed from https://ideas.repec.org/a/spr/scient/v129y2024i7d10.1007_s11192-024-05071-7.html

A validation test of the Uzzi et al. novelty measure of innovation and applications to collaboration patterns between institutions

Author

Listed:
  • Yuefen Wang

    (Tianjin Normal University)

  • Lipeng Fan

    (Tianjin Normal University)

  • Lei Wu

    (Shandong Normal University)

Abstract

Establishing a robust and universally applicable bibliometric indicator for assessing creativity is essential but challenging. The novelty measure of innovation proposed by Uzzi et al. (NoveltyU) has sparked considerable interest and debate, so further validation of its portfolio form of novelty and of its scope of application is necessary. This paper delves into the calculation and application of the NoveltyU method to shed light on its effectiveness and scope. Analysis of the calculation process reveals that journal pairs with higher novelty often span independent fundamental areas, while those with lower novelty tend to concentrate in similar, applied fields. Applying the method to the collaboration patterns between institutions discussed in our prior study (Fan et al., Scientometrics 125:1179–1196, 2020) offers insight into its performance in a real-world context. Results consistently show higher mean NoveltyU values in the MM pattern over time, supporting the method's validity. Categorizing papers as high-conventional, low-conventional, low-novel, or high-novel reveals a higher degree of term overlap across patterns among high-novel papers. Moreover, leading terms in the MM pattern carry specific information, whereas delay terms tend to be more general, and simultaneous terms even more so. These findings help identify hot and frontier topics, bolster the credibility and application potential of the NoveltyU method, and align with the broader objective of establishing valid measures of innovativeness in research.
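The NoveltyU measure discussed here scores each paper by the atypicality of the journal pairs in its reference list: each pair's observed co-citation frequency is compared with its frequency in randomized citation networks to give a z-score, and a paper's novelty is then read from the left tail (roughly the 10th percentile) of its pair z-scores, with conventionality taken from the median. The following is a minimal illustrative sketch of that scoring step, assuming precomputed observed counts and null-model statistics; all function and variable names are hypothetical, not the authors' code:

```python
from itertools import combinations
from statistics import median

def pair_zscores(obs_count, null_mean, null_std):
    """z-score each journal pair's co-citation count against a null model.

    Assumed inputs (hypothetical names):
    obs_count[pair] - observed co-citation count of a journal pair
    null_mean[pair] - mean count of that pair in randomized citation networks
    null_std[pair]  - standard deviation of that count under the null model
    """
    z = {}
    for pair in obs_count:
        sd = null_std[pair] or 1e-9  # guard against zero variance
        z[pair] = (obs_count[pair] - null_mean[pair]) / sd
    return z

def paper_scores(ref_journals, z):
    """Score one paper from the z-scores of its cited-journal pairs.

    Novelty          = ~10th-percentile z (negative tail -> atypical pairs).
    Conventionality  = median z (typical pairs).
    """
    pairs = combinations(sorted(set(ref_journals)), 2)
    zs = sorted(z[p] for p in pairs if p in z)
    if not zs:
        return None, None
    tail = zs[int(0.1 * len(zs))]  # crude 10th-percentile pick
    return tail, median(zs)
```

A pair whose observed count falls far below its null-model expectation gets a strongly negative z-score, so papers drawing on rarely combined journals (e.g. spanning independent fundamental areas, as the abstract notes) receive low tail values and count as highly novel.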

Suggested Citation

  • Yuefen Wang & Lipeng Fan & Lei Wu, 2024. "A validation test of the Uzzi et al. novelty measure of innovation and applications to collaboration patterns between institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4379-4394, July.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:7:d:10.1007_s11192-024-05071-7
    DOI: 10.1007/s11192-024-05071-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-024-05071-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-024-05071-7?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    2. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    3. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    4. Lipeng Fan & Yuefen Wang & Shengchun Ding & Binbin Qi, 2020. "Productivity trends and citation impact of different institutional collaboration patterns at the research units’ level," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1179-1196, November.
    5. Shiyun Wang & Yaxue Ma & Jin Mao & Yun Bai & Zhentao Liang & Gang Li, 2023. "Quantifying scientific breakthroughs by a novel disruption indicator based on knowledge entities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 150-167, February.
    6. Michael Park & Erin Leahey & Russell J. Funk, 2023. "Papers and patents are becoming less disruptive over time," Nature, Nature, vol. 613(7942), pages 138-144, January.
    7. Sun, Bixuan & Kolesnikov, Sergey & Goldstein, Anna & Chan, Gabriel, 2021. "A dynamic approach for identifying technological breakthroughs with an application in solar photovoltaics," Technological Forecasting and Social Change, Elsevier, vol. 165(C).
    8. Gault, Fred, 2018. "Defining and measuring innovation in all sectors of the economy," Research Policy, Elsevier, vol. 47(3), pages 617-622.
    9. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    10. Luo, Zhuoran & Lu, Wei & He, Jiangen & Wang, Yuqi, 2022. "Combination of research questions and methods: A new measurement of scientific novelty," Journal of Informetrics, Elsevier, vol. 16(2).
    11. Arts, Sam & Hou, Jianan & Gomez, Juan Carlos, 2021. "Natural language processing to identify the creation and impact of new technologies in patent text: Code, data, and new measures," Research Policy, Elsevier, vol. 50(2).
    12. Gao, Qiang & Liang, Zhentao & Wang, Ping & Hou, Jingrui & Chen, Xiuxiu & Liu, Manman, 2021. "Potential index: Revealing the future impact of research topics based on current knowledge networks," Journal of Informetrics, Elsevier, vol. 15(3).
    13. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    14. Wagner, Caroline S. & Whetsell, Travis A. & Mukherjee, Satyam, 2019. "International research collaboration: Novelty, conventionality, and atypicality in knowledge recombination," Research Policy, Elsevier, vol. 48(5), pages 1260-1270.
    15. Chao Min & Ying Ding & Jiang Li & Yi Bu & Lei Pei & Jianjun Sun, 2018. "Innovation or imitation: The diffusion of citations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(10), pages 1271-1282, October.
    16. Fontana, Magda & Iori, Martina & Montobbio, Fabio & Sinatra, Roberta, 2020. "New and atypical combinations: An assessment of novelty and interdisciplinarity," Research Policy, Elsevier, vol. 49(7).
    17. Min, Chao & Bu, Yi & Sun, Jianjun, 2021. "Predicting scientific breakthroughs based on knowledge structure variations," Technological Forecasting and Social Change, Elsevier, vol. 164(C).
    18. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    19. Chen, Jiyao & Shao, Diana & Fan, Shaokun, 2021. "Destabilization and consolidation: Conceptualizing, measuring, and validating the dual characteristics of technology," Research Policy, Elsevier, vol. 50(1).
    20. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
    2. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Oct 2024.
    3. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    4. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    5. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    6. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    7. Keye Wu & Ziyue Xie & Jia Tina Du, 2024. "Does science disrupt technology? Examining science intensity, novelty, and recency through patent-paper citations in the pharmaceutical field," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(9), pages 5469-5491, September.
    8. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    9. António Osório & Lutz Bornmann, 2021. "On the disruptive power of small-teams research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 117-133, January.
    10. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    11. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    12. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).
    13. Wang, Cheng-Jun & Yan, Lihan & Cui, Haochuan, 2023. "Unpacking the essential tension of knowledge recombination: Analyzing the impact of knowledge spanning on citation impact and disruptive innovation," Journal of Informetrics, Elsevier, vol. 17(4).
    14. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    15. Shiyun Wang & Yaxue Ma & Jin Mao & Yun Bai & Zhentao Liang & Gang Li, 2023. "Quantifying scientific breakthroughs by a novel disruption indicator based on knowledge entities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 150-167, February.
    16. Jeon, Daeseong & Lee, Junyoup & Ahn, Joon Mo & Lee, Changyong, 2023. "Measuring the novelty of scientific publications: A fastText and local outlier factor approach," Journal of Informetrics, Elsevier, vol. 17(4).
    17. Houqiang Yu & Yian Liang & Yinghua Xie, 2024. "Predicting Scientific Breakthroughs Based on Structural Dynamic of Citation Cascades," Mathematics, MDPI, vol. 12(11), pages 1-18, June.
    18. Giulio Giacomo Cantone, 2024. "How to measure interdisciplinary research? A systemic design for the model of measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(8), pages 4937-4982, August.
    19. Tong, Tong & Wang, Wanru & Ye, Fred Y., 2024. "A complement to the novel disruption indicator based on knowledge entities," Journal of Informetrics, Elsevier, vol. 18(2).
    20. Guo, Liying & Wang, Yang & Li, Meiling, 2024. "Exploration, exploitation and funding success: Evidence from junior scientists supported by the Chinese Young Scientists Fund," Journal of Informetrics, Elsevier, vol. 18(2).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:129:y:2024:i:7:d:10.1007_s11192-024-05071-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.