
What do we know about the disruption index in scientometrics? An overview of the literature

Author

Listed:
  • Christian Leibel

    (Administrative Headquarters of the Max Planck Society
    Ludwig-Maximilians-Universität München)

  • Lutz Bornmann

    (Administrative Headquarters of the Max Planck Society)

Abstract

The purpose of this paper is to provide a review of the literature on the original disruption index (DI1) and its variants in scientometrics. The DI1 has received much media attention and prompted a public debate about science policy implications since a study published in Nature found that papers in all disciplines and patents are becoming less disruptive over time. The first part of this review explains the DI1 and its variants in detail by examining their technical and theoretical properties. The remaining parts are devoted to studies that examine the validity and the limitations of the indices. Particular focus is placed on (1) possible biases that affect disruption indices, (2) the convergent and predictive validity of disruption scores, and (3) the comparative performance of the DI1 and its variants. The review shows that, while the literature on convergent validity is not entirely conclusive, some modified index variants, in particular DI5, show higher degrees of convergent validity than DI1. The literature draws attention to the fact that (some) disruption indices suffer from inconsistency, time-sensitive biases, and several data-induced biases. The limitations of disruption indices are highlighted and best practice guidelines are provided. The review encourages users of the index to familiarize themselves with the variety of DI1 variants and to apply the most appropriate variant. More research on the validity of disruption scores, as well as a more precise understanding of disruption as a theoretical construct, is needed before the indices can be used in research evaluation practice.
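The computation behind the DI1 reviewed here can be sketched briefly. Following Funk and Owen-Smith (2017) and Wu, Wang, and Evans (2019), DI1 = (n_i − n_j) / (n_i + n_j + n_k), where n_i counts subsequent papers that cite the focal paper but none of its references, n_j counts those that cite both the focal paper and at least one of its references, and n_k counts those that cite the references but not the focal paper. A minimal Python sketch (function name and data layout are illustrative, not taken from the paper):

```python
def disruption_index(focal, focal_refs, citing_map):
    """DI1 = (n_i - n_j) / (n_i + n_j + n_k).

    focal: identifier of the focal paper.
    focal_refs: set of papers the focal paper cites.
    citing_map: {paper_id: set of papers it cites} for all subsequent
                papers that cite the focal paper and/or its references.
    """
    n_i = n_j = n_k = 0
    for cites in citing_map.values():
        cites_focal = focal in cites
        cites_refs = bool(cites & focal_refs)
        if cites_focal and not cites_refs:
            n_i += 1   # cites focal but none of its references (disruptive)
        elif cites_focal and cites_refs:
            n_j += 1   # cites focal and its references (consolidating)
        elif cites_refs:
            n_k += 1   # cites only the focal paper's references
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0

# Toy example: one purely disruptive citer, one consolidating citer,
# and one paper citing only the references.
refs = {"r1", "r2"}
citers = {
    "a": {"focal"},        # cites focal only        -> n_i
    "b": {"focal", "r1"},  # cites focal + reference -> n_j
    "c": {"r2"},           # cites reference only    -> n_k
}
print(disruption_index("focal", refs, citers))  # (1 - 1) / 3 = 0.0
```

Variants discussed in the review modify these counts; DI5 (Bornmann et al., 2020), for instance, requires a citing paper to cite at least five of the focal paper's references before it counts toward n_j, which tightens the consolidating criterion while leaving the overall loop structure unchanged.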

Suggested Citation

  • Christian Leibel & Lutz Bornmann, 2024. "What do we know about the disruption index in scientometrics? An overview of the literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(1), pages 601-639, January.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:1:d:10.1007_s11192-023-04873-5
    DOI: 10.1007/s11192-023-04873-5

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-023-04873-5
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-023-04873-5?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Michael Park & Erin Leahey & Russell J. Funk, 2023. "Papers and patents are becoming less disruptive over time," Nature, Nature, vol. 613(7942), pages 138-144, January.
    2. Erin Leahey & Jina Lee & Russell J. Funk, 2023. "What Types of Novelty Are Most Disruptive?," American Sociological Review, vol. 88(3), pages 562-597, June.
    3. Arts, Sam & Hou, Jianan & Gomez, Juan Carlos, 2021. "Natural language processing to identify the creation and impact of new technologies in patent text: Code, data, and new measures," Research Policy, Elsevier, vol. 50(2).
    4. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    5. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    6. Alcácer, Juan & Gittelman, Michelle & Sampat, Bhaven, 2009. "Applicant and examiner citations in U.S. patents: An overview and analysis," Research Policy, Elsevier, vol. 38(2), pages 415-427, March.
    7. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    8. Nan Deng & An Zeng, 2023. "Enhancing the robustness of the disruption metric against noise," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(4), pages 2419-2428, April.
    9. Jiexun Li & Jiyao Chen, 2022. "Measuring destabilization and consolidation in scientific knowledge evolution," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5819-5839, October.
    10. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    11. Lutz Bornmann & Alexander Tekles, 2019. "Disruptive papers published in Scientometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 331-336, July.
    12. Tahamtan, Iman & Bornmann, Lutz, 2018. "Core elements in the process of citing publications: Conceptual overview of the literature," Journal of Informetrics, Elsevier, vol. 12(1), pages 203-216.
    13. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    14. Alexander M. Petersen & Felber Arroyave & Fabio Pammolli, 2023. "The disruption index is biased by citation inflation," Papers 2306.01949, arXiv.org.
    15. Chen, Jiyao & Shao, Diana & Fan, Shaokun, 2021. "Destabilization and consolidation: Conceptualizing, measuring, and validating the dual characteristics of technology," Research Policy, Elsevier, vol. 50(1).
    16. Jeffrey T. Macher & Christian Rutzer & Rolf Weder, 2023. "The Illusive Slump of Disruptive Patents," Papers 2306.10774, arXiv.org.
    17. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    18. Lutz Bornmann & Hans-Dieter Daniel, 2009. "Reviewer and editor biases in journal peer review: an investigation of manuscript refereeing at Angewandte Chemie International Edition," Research Evaluation, Oxford University Press, vol. 18(4), pages 262-272, October.
    19. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    20. Shiyun Wang & Yaxue Ma & Jin Mao & Yun Bai & Zhentao Liang & Gang Li, 2023. "Quantifying scientific breakthroughs by a novel disruption indicator based on knowledge entities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 150-167, February.
    21. Lutz Bornmann & Sitaram Devarakonda & Alexander Tekles & George Chacko, 2020. "Disruptive papers published in Scientometrics: meaningful results by using an improved variant of the disruption index originally proposed by Wu, Wang, and Evans (2019)," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(2), pages 1149-1155, May.
    22. Boris Forthmann & Mark A. Runco, 2020. "An Empirical Test of the Inter-Relationships between Various Bibliometric Creative Scholarship Indicators," Publications, MDPI, vol. 8(2), pages 1-16, June.
    23. Yuyan Jiang & Xueli Liu, 2023. "A Bibliometric Analysis and Disruptive Innovation Evaluation for the Field of Energy Security," Sustainability, MDPI, vol. 15(2), pages 1-29, January.
    24. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    25. Lee, You-Na & Walsh, John P. & Wang, Jian, 2015. "Creativity in scientific teams: Unpacking novelty and impact," Research Policy, Elsevier, vol. 44(3), pages 684-697.
    26. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    27. Yuyan Jiang & Xueli Liu, 2023. "A construction and empirical research of the journal disruption index based on open citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 3935-3958, July.
    28. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    29. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    30. Ruan, Xuanmin & Lyu, Dongqing & Gong, Kaile & Cheng, Ying & Li, Jiang, 2021. "Rethinking the disruption index as a measure of scientific and technological advances," Technological Forecasting and Social Change, Elsevier, vol. 172(C).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations for this item are extracted by the CitEc Project.


    Cited by:

    1. Christian Leibel & Lutz Bornmann, 2024. "Specification uncertainty: what the disruption index tells us about the (hidden) multiverse of bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(12), pages 7971-7979, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ziyan Zhang & Junyan Zhang & Pushi Wang, 2024. "Measurement of disruptive innovation and its validity based on improved disruption index," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(11), pages 6477-6531, November.
    2. Alex J. Yang & Hongcun Gong & Yuhao Wang & Chao Zhang & Sanhong Deng, 2024. "Rescaling the disruption index reveals the universality of disruption distributions in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(1), pages 561-580, January.
    3. Christian Leibel & Lutz Bornmann, 2024. "Specification uncertainty: what the disruption index tells us about the (hidden) multiverse of bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(12), pages 7971-7979, December.
    4. Yuefen Wang & Lipeng Fan & Lei Wu, 2024. "A validation test of the Uzzi et al. novelty measure of innovation and applications to collaboration patterns between institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4379-4394, July.
    5. Shiji Chen & Yanan Guo & Alvin Shijie Ding & Yanhui Song, 2024. "Is interdisciplinarity more likely to produce novel or disruptive research?," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(5), pages 2615-2632, May.
    6. Ao, Weiyi & Lyu, Dongqing & Ruan, Xuanmin & Li, Jiang & Cheng, Ying, 2023. "Scientific creativity patterns in scholars’ academic careers: Evidence from PubMed," Journal of Informetrics, Elsevier, vol. 17(4).
    7. Zhang, Ming-Ze & Wang, Tang-Rong & Lyu, Peng-Hui & Chen, Qi-Mei & Li, Ze-Xia & Ngai, Eric W.T., 2024. "Impact of gender composition of academic teams on disruptive output," Journal of Informetrics, Elsevier, vol. 18(2).
    8. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    9. Yi Zhao & Chengzhi Zhang, 2025. "A review on the novelty measurements of academic papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 130(2), pages 727-753, February.
    10. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    11. Yuyan Jiang & Xueli Liu, 2023. "A construction and empirical research of the journal disruption index based on open citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 3935-3958, July.
    12. Keye Wu & Ziyue Xie & Jia Tina Du, 2024. "Does science disrupt technology? Examining science intensity, novelty, and recency through patent-paper citations in the pharmaceutical field," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(9), pages 5469-5491, September.
    13. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Dec 2024.
    14. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    15. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    16. António Osório & Lutz Bornmann, 2021. "On the disruptive power of small-teams research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 117-133, January.
    17. Leydesdorff, Loet & Bornmann, Lutz, 2021. "Disruption indices and their calculation using web-of-science data: Indicators of historical developments or evolutionary dynamics?," Journal of Informetrics, Elsevier, vol. 15(4).
    18. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    19. Li, Meiling & Wang, Yang & Du, Haifeng & Bai, Aruhan, 2024. "Motivating innovation: The impact of prestigious talent funding on junior scientists," Research Policy, Elsevier, vol. 53(9).
    20. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.