
An effective framework for measuring the novelty of scientific articles through integrated topic modeling and cloud model

Author

Listed:
  • Wang, Zhongyi
  • Zhang, Haoxuan
  • Chen, Jiangping
  • Chen, Haihua

Abstract

Novelty is a critical characteristic of innovative scientific articles, and accurately identifying novelty can facilitate the early detection of scientific breakthroughs. However, existing methods for measuring novelty have two main limitations: (1) metadata-based approaches, such as citation analysis, are retrospective and do not alleviate the pressures of the peer review process or enable timely tracking of scientific progress; (2) content-based methods have not adequately addressed the inherent uncertainty between the qualitative concept of novelty and the textual representation of papers. To address these issues, we propose a practical and effective framework for measuring the novelty of scientific articles through integrated topic modeling and cloud model, referred to as MNSA-ITMCM. In this framework, papers are represented as topic combinations, and novelty is reflected in the organic reorganization of these topics. We use the BERTopic model to generate semantically informed topics, and then apply a topic selection algorithm based on maximum marginal relevance to obtain a topic combination that balances similarity and diversity. Furthermore, we leverage the cloud model from fuzzy mathematics to quantify novelty, overcoming the uncertainty inherent in natural language expression and topic modeling to improve the accuracy of novelty measurement. To validate the effectiveness of our framework, we conducted empirical evaluations on papers published in Cell in 2021 (biomedical domain) and at ICLR 2023 (computer science domain). Through correlation analysis and prediction error analysis, our framework demonstrated the ability to identify different types of novel papers and to accurately predict their novelty levels. The proposed framework is applicable across diverse scientific disciplines and publication venues, benefiting researchers, librarians, science evaluation agencies, policymakers, and funding organizations by improving the efficiency and comprehensiveness of identifying novel research.
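
The abstract names three computational ingredients: BERTopic for topic generation, maximum-marginal-relevance (MMR) topic selection, and a normal cloud model for quantifying novelty under uncertainty. The sketch below is a minimal illustration of the latter two steps, not the authors' implementation: it uses the textbook greedy MMR rule and the standard backward normal cloud generator (estimating Ex, En, He). The function names (mmr_select, backward_cloud), parameters (k, lam), and embeddings are illustrative stand-ins; in the paper, topic vectors would come from BERTopic, and the exact formulas and parameters are not given in this record.

```python
# Minimal sketch of two building blocks suggested by the abstract:
# (1) greedy MMR selection of topics for a paper, and
# (2) a standard backward normal cloud generator (Li's cloud model),
#     used here to summarize a distribution of novelty-related scores.
# All vectors and scores below are synthetic stand-ins.

import numpy as np

def cosine(a, b):
    # Cosine similarity between two vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def mmr_select(paper_vec, topic_vecs, k=5, lam=0.7):
    """Greedily pick k topics, trading off relevance to the paper (weight
    lam) against redundancy with already-selected topics (weight 1-lam)."""
    candidates = list(range(len(topic_vecs)))
    selected = []
    while candidates and len(selected) < k:
        def score(i):
            rel = cosine(paper_vec, topic_vecs[i])
            red = max((cosine(topic_vecs[i], topic_vecs[j])
                       for j in selected), default=0.0)
            return lam * rel - (1 - lam) * red
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

def backward_cloud(samples):
    """Estimate the digital characteristics (Ex, En, He) of a normal cloud
    from 1-D samples (backward cloud generator without certainty degrees)."""
    x = np.asarray(samples, dtype=float)
    ex = x.mean()                                   # expectation Ex
    en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()  # entropy En
    he = np.sqrt(max(x.var(ddof=1) - en ** 2, 0.0))    # hyper-entropy He
    return ex, en, he

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    topic_vecs = rng.normal(size=(50, 16))  # stand-in topic embeddings
    paper_vec = rng.normal(size=16)         # stand-in paper embedding
    print("selected topics:", mmr_select(paper_vec, topic_vecs))

    # Pretend these are rarity scores of a paper's topic combination; the
    # cloud model summarizes their central tendency and uncertainty.
    scores = rng.normal(loc=0.6, scale=0.1, size=100)
    print("cloud (Ex, En, He):", backward_cloud(scores))
```

Greedy MMR keeps selection cheap (one pass over the candidates per pick), and the backward cloud generator recovers the hyper-entropy He by subtracting En² from the sample variance, clamped at zero so that noisy samples cannot yield an imaginary He.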

Suggested Citation

  • Wang, Zhongyi & Zhang, Haoxuan & Chen, Jiangping & Chen, Haihua, 2024. "An effective framework for measuring the novelty of scientific articles through integrated topic modeling and cloud model," Journal of Informetrics, Elsevier, vol. 18(4).
  • Handle: RePEc:eee:infome:v:18:y:2024:i:4:s1751157724000993
    DOI: 10.1016/j.joi.2024.101587

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157724000993
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2024.101587?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Kuniko Matsumoto & Sotaro Shibayama & Byeongwoo Kang & Masatsura Igami, 2021. "Introducing a novelty indicator for scientific research: validating the knowledge-based combinatorial approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6891-6915, August.
    2. Yan Yan & Shanwu Tian & Jingjing Zhang, 2020. "The impact of a paper’s new combinations and new components on its citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 895-913, February.
    3. Serge P J M Horbach & Freek J W Oude Maatman & Willem Halffman & Wytske M Hepkema, 2022. "Automated citation recommendation tools encourage questionable citations," Research Evaluation, Oxford University Press, vol. 31(3), pages 321-325.
    4. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    5. Xiaoying Li & Suyuan Peng & Jian Du, 2021. "Towards medical knowmetrics: representing and computing medical knowledge using semantic predications as the knowledge unit and the uncertainty as the knowledge context," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 6225-6251, July.
    6. Luo, Zhuoran & Lu, Wei & He, Jiangen & Wang, Yuqi, 2022. "Combination of research questions and methods: A new measurement of scientific novelty," Journal of Informetrics, Elsevier, vol. 16(2).
    7. Arts, Sam & Hou, Jianan & Gomez, Juan Carlos, 2021. "Natural language processing to identify the creation and impact of new technologies in patent text: Code, data, and new measures," Research Policy, Elsevier, vol. 50(2).
    8. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    9. Mingxin Yao & Ying Wei & Huiyu Wang, 2023. "Promoting research by reducing uncertainty in academic writing: a large-scale diachronic case study on hedging in Science research articles across 25 years," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4541-4558, August.
    10. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    11. Janine Huisman & Jeroen Smits, 2017. "Duration and quality of the peer review process: the author’s perspective," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(1), pages 633-650, October.
    12. Manuel Trajtenberg & Rebecca Henderson & Adam Jaffe, 1997. "University Versus Corporate Patents: A Window On The Basicness Of Invention," Economics of Innovation and New Technology, Taylor & Francis Journals, vol. 5(1), pages 19-50.
    13. Yogesh K. Dwivedi & A. Sharma & Nripendra P. Rana & M. Giannakis & P. Goel & Vincent Dutot, 2023. "Evolution of Artificial Intelligence Research in Technological Forecasting and Social Change: Research Topics, Trends, and Future Directions," Post-Print hal-04292607, HAL.
    14. Fontana, Magda & Iori, Martina & Montobbio, Fabio & Sinatra, Roberta, 2020. "New and atypical combinations: An assessment of novelty and interdisciplinarity," Research Policy, Elsevier, vol. 49(7).
    15. Min, Chao & Bu, Yi & Sun, Jianjun, 2021. "Predicting scientific breakthroughs based on knowledge structure variations," Technological Forecasting and Social Change, Elsevier, vol. 164(C).
    16. Bas Hofstra & Vivek V. Kulkarni & Sebastian Munoz-Najar Galvez & Bryan He & Dan Jurafsky & Daniel A. McFarland, 2020. "The Diversity–Innovation Paradox in Science," Proceedings of the National Academy of Sciences, Proceedings of the National Academy of Sciences, vol. 117(17), pages 9284-9291, April.
    17. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2020. "A novel methodology to assess the scientific standing of nations at field level," Journal of Informetrics, Elsevier, vol. 14(1).
    18. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    19. Xu, Shuo & Hao, Liyuan & Yang, Guancan & Lu, Kun & An, Xin, 2021. "A topic models based framework for detecting and forecasting emerging technologies," Technological Forecasting and Social Change, Elsevier, vol. 162(C).
    20. Verhoeven, Dennis & Bakker, Jurriën & Veugelers, Reinhilde, 2016. "Measuring technological novelty with patent-based indicators," Research Policy, Elsevier, vol. 45(3), pages 707-723.
    21. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    22. Lee, You-Na & Walsh, John P. & Wang, Jian, 2015. "Creativity in scientific teams: Unpacking novelty and impact," Research Policy, Elsevier, vol. 44(3), pages 684-697.
    23. Christian Leibel & Lutz Bornmann, 2024. "What do we know about the disruption index in scientometrics? An overview of the literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(1), pages 601-639, January.
    24. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    25. Jeon, Daeseong & Lee, Junyoup & Ahn, Joon Mo & Lee, Changyong, 2023. "Measuring the novelty of scientific publications: A fastText and local outlier factor approach," Journal of Informetrics, Elsevier, vol. 17(4).
    26. Dwivedi, Yogesh K. & Sharma, Anuj & Rana, Nripendra P. & Giannakis, Mihalis & Goel, Pooja & Dutot, Vincent, 2023. "Evolution of artificial intelligence research in Technological Forecasting and Social Change: Research topics, trends, and future directions," Technological Forecasting and Social Change, Elsevier, vol. 192(C).
    27. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    28. Zhongyi Wang & Keying Wang & Jiyue Liu & Jing Huang & Haihua Chen, 2022. "Measuring the innovation of method knowledge elements in scientific literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2803-2827, May.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Yi Zhao & Chengzhi Zhang, 2025. "A review on the novelty measurements of academic papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 130(2), pages 727-753, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yuefen Wang & Lipeng Fan & Lei Wu, 2024. "A validation test of the Uzzi et al. novelty measure of innovation and applications to collaboration patterns between institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4379-4394, July.
    2. Yi Zhao & Chengzhi Zhang, 2025. "A review on the novelty measurements of academic papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 130(2), pages 727-753, February.
    3. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    4. Christian Leibel & Lutz Bornmann, 2024. "What do we know about the disruption index in scientometrics? An overview of the literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(1), pages 601-639, January.
    5. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Dec 2024.
    6. Shiji Chen & Yanan Guo & Alvin Shijie Ding & Yanhui Song, 2024. "Is interdisciplinarity more likely to produce novel or disruptive research?," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(5), pages 2615-2632, May.
    7. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    8. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    9. Jeon, Daeseong & Lee, Junyoup & Ahn, Joon Mo & Lee, Changyong, 2023. "Measuring the novelty of scientific publications: A fastText and local outlier factor approach," Journal of Informetrics, Elsevier, vol. 17(4).
    10. Sotaro Shibayama & Deyun Yin & Kuniko Matsumoto, 2021. "Measuring novelty in science with word embedding," PLOS ONE, Public Library of Science, vol. 16(7), pages 1-16, July.
    11. Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
    12. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    13. Luo, Zhuoran & Lu, Wei & He, Jiangen & Wang, Yuqi, 2022. "Combination of research questions and methods: A new measurement of scientific novelty," Journal of Informetrics, Elsevier, vol. 16(2).
    14. Chen, Jiyao & Shao, Diana & Fan, Shaokun, 2021. "Destabilization and consolidation: Conceptualizing, measuring, and validating the dual characteristics of technology," Research Policy, Elsevier, vol. 50(1).
    15. Alex J. Yang & Hongcun Gong & Yuhao Wang & Chao Zhang & Sanhong Deng, 2024. "Rescaling the disruption index reveals the universality of disruption distributions in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(1), pages 561-580, January.
    16. Zhaoping Yan & Kaiyu Fan, 2024. "An integrated indicator for evaluating scientific papers: considering academic impact and novelty," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(11), pages 6909-6929, November.
    17. Chen, Wei & Yan, Yan, 2023. "New components and combinations: The perspective of the internal collaboration networks of scientific teams," Journal of Informetrics, Elsevier, vol. 17(2).
    18. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    19. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    20. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:18:y:2024:i:4:s1751157724000993. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.