IDEAS home Printed from https://ideas.repec.org/a/eee/infome/v16y2022i3s175115772200058x.html

A new method for measuring the originality of academic articles based on knowledge units in semantic networks

Author

Listed:
  • Hou, Jianhua
  • Wang, Dongyi
  • Li, Jing

Abstract

Research on evaluating the quality of academic papers is attracting increasing attention from scholars in scientometrics. However, most previous studies have assessed paper quality through external indicators, such as citations, which fail to account for the content of the research. To that end, this paper proposes a new method for measuring a paper's originality. The method is based on knowledge units in semantic networks, focusing on the relationships and semantic similarity among different knowledge units. Connectivity and path similarity between content elements within these networks serve as indicators of originality. This study used papers published between 2014 and 2018 in three categories (i.e., Library & Information Science, Educational Psychology, and Carbon Nanotubes) and divided their content into three parts (i.e., research topics, research methods, and research results). It was found that originality in all categories increased each year. Furthermore, a comparison of our new method with previous models of citation network analysis and knowledge combination analysis showed that it outperforms those methods in measuring originality.
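The abstract's core idea, scoring originality from connectivity and path similarity among knowledge units in a semantic network, can be illustrated with a minimal sketch. The toy network, the inverse-shortest-path similarity, and the `originality` function below are illustrative assumptions, not the authors' actual formulation:

```python
from collections import deque

# Toy semantic network: knowledge units (terms) linked when they
# co-occur in a paper's content elements (hypothetical data).
edges = [
    ("citation analysis", "semantic network"),
    ("semantic network", "knowledge unit"),
    ("knowledge unit", "path similarity"),
    ("citation analysis", "h-index"),
]

graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def shortest_path_length(graph, src, dst):
    """BFS shortest-path length; None if the units are disconnected."""
    if src == dst:
        return 0
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt == dst:
                return dist + 1
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None

def path_similarity(graph, a, b):
    """Inverse shortest-path similarity: 1/(1+d); 0 if disconnected."""
    d = shortest_path_length(graph, a, b)
    return 0.0 if d is None else 1.0 / (1.0 + d)

def originality(graph, units):
    """One possible originality score for a paper's knowledge units:
    1 minus the mean pairwise path similarity, so papers that link
    distant (weakly connected) units score higher."""
    pairs = [(u, v) for i, u in enumerate(units) for v in units[i + 1:]]
    if not pairs:
        return 0.0
    mean_sim = sum(path_similarity(graph, u, v) for u, v in pairs) / len(pairs)
    return 1.0 - mean_sim

# Units three hops apart: similarity 1/4, so originality 0.75.
print(originality(graph, ["citation analysis", "path similarity"]))  # → 0.75
```

Under this sketch a paper combining units that sit far apart in the network scores closer to 1, echoing the intuition that original work bridges distant knowledge.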

Suggested Citation

  • Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
  • Handle: RePEc:eee:infome:v:16:y:2022:i:3:s175115772200058x
    DOI: 10.1016/j.joi.2022.101306

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S175115772200058X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2022.101306?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bornmann, Lutz & Haunschild, Robin & Adams, Jonathan, 2019. "Do altmetrics assess societal impact in a comparable way to case studies? An empirical test of the convergent validity of altmetrics based on data from the UK research excellence framework (REF)," Journal of Informetrics, Elsevier, vol. 13(1), pages 325-340.
    2. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    3. Fairclough, Ruth & Thelwall, Mike, 2015. "More precise methods for national research citation impact comparisons," Journal of Informetrics, Elsevier, vol. 9(4), pages 895-906.
    4. Bruce Kogut & Udo Zander, 1992. "Knowledge of the Firm, Combinative Capabilities, and the Replication of Technology," Organization Science, INFORMS, vol. 3(3), pages 383-397, August.
    5. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    6. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    7. Alan L Porter & J David Roessner & Alex S Cohen & Marty Perreault, 2006. "Interdisciplinary research: meaning, metrics and nurture," Research Evaluation, Oxford University Press, vol. 15(3), pages 187-195, December.
    8. Terrence A. Brooks, 1986. "Evidence of complex citer motivations," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 37(1), pages 34-36, January.
    9. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    10. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    11. Yan Yan & Shanwu Tian & Jingjing Zhang, 2020. "The impact of a paper’s new combinations and new components on its citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 895-913, February.
    12. Chaomei Chen, 2006. "CiteSpace II: Detecting and visualizing emerging trends and transient patterns in scientific literature," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 57(3), pages 359-377, February.
    13. Michael H. MacRoberts & Barbara R. MacRoberts, 1989. "Problems of citation analysis: A critical review," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 40(5), pages 342-349, September.
    14. Bertoli-Barsotti, Lucio & Lando, Tommaso, 2015. "On a formula for the h-index," Journal of Informetrics, Elsevier, vol. 9(4), pages 762-776.
    15. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    16. Xiaoying Li & Suyuan Peng & Jian Du, 2021. "Towards medical knowmetrics: representing and computing medical knowledge using semantic predications as the knowledge unit and the uncertainty as the knowledge context," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 6225-6251, July.
    17. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    18. Ostermaier, Andreas & Uhl, Matthias, 2020. "Performance evaluation and creativity: Balancing originality and usefulness," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 86(C).
    19. Trapido, Denis, 2015. "How novelty in knowledge earns recognition: The role of consistent identities," Research Policy, Elsevier, vol. 44(8), pages 1488-1500.
    20. Lin, Yiling & Evans, James A. & Wu, Lingfei, 2022. "New directions in science emerge from disconnection and discord," Journal of Informetrics, Elsevier, vol. 16(1).
    21. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    22. Chao Min & Ying Ding & Jiang Li & Yi Bu & Lei Pei & Jianjun Sun, 2018. "Innovation or imitation: The diffusion of citations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(10), pages 1271-1282, October.
    23. Chaomei Chen, 2012. "Predictive effects of structural variation on citation counts," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(3), pages 431-449, March.
    24. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    25. Weitzman, Martin L, 1996. "Hybridizing Growth Theory," American Economic Review, American Economic Association, vol. 86(2), pages 207-212, May.
    26. Lin Zhang & Ronald Rousseau & Wolfgang Glänzel, 2016. "Diversity of references as an indicator of the interdisciplinarity of journals: Taking similarity between subject fields into account," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(5), pages 1257-1265, May.
    27. Abramo, Giovanni, 2018. "Revisiting the scientometric conceptualization of impact and its measurement," Journal of Informetrics, Elsevier, vol. 12(3), pages 590-597.
    28. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    29. Bornmann, Lutz, 2014. "Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics," Journal of Informetrics, Elsevier, vol. 8(4), pages 895-903.
    30. Lee Fleming, 2001. "Recombinant Uncertainty in Technological Search," Management Science, INFORMS, vol. 47(1), pages 117-132, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Yining Wang & Qiang Wu & Liangyu Li, 2024. "Examining the influence of women scientists on scientific impact and novelty: insights from top business journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(6), pages 3517-3542, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    2. Sotaro Shibayama & Deyun Yin & Kuniko Matsumoto, 2021. "Measuring novelty in science with word embedding," PLOS ONE, Public Library of Science, vol. 16(7), pages 1-16, July.
    3. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    4. Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
    5. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    6. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    7. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Oct 2024.
    8. Wang, Cheng-Jun & Yan, Lihan & Cui, Haochuan, 2023. "Unpacking the essential tension of knowledge recombination: Analyzing the impact of knowledge spanning on citation impact and disruptive innovation," Journal of Informetrics, Elsevier, vol. 17(4).
    9. Boris Forthmann & Mark A. Runco, 2020. "An Empirical Test of the Inter-Relationships between Various Bibliometric Creative Scholarship Indicators," Publications, MDPI, vol. 8(2), pages 1-16, June.
    10. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    11. Min, Chao & Bu, Yi & Sun, Jianjun, 2021. "Predicting scientific breakthroughs based on knowledge structure variations," Technological Forecasting and Social Change, Elsevier, vol. 164(C).
    12. Guo, Liying & Wang, Yang & Li, Meiling, 2024. "Exploration, exploitation and funding success: Evidence from junior scientists supported by the Chinese Young Scientists Fund," Journal of Informetrics, Elsevier, vol. 18(2).
    13. Zhang, Xinyuan & Xie, Qing & Song, Min, 2021. "Measuring the impact of novelty, bibliometric, and academic-network factors on citation count using a neural network," Journal of Informetrics, Elsevier, vol. 15(2).
    14. Kuniko Matsumoto & Sotaro Shibayama & Byeongwoo Kang & Masatsura Igami, 2021. "Introducing a novelty indicator for scientific research: validating the knowledge-based combinatorial approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6891-6915, August.
    15. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    16. Luo, Zhuoran & Lu, Wei & He, Jiangen & Wang, Yuqi, 2022. "Combination of research questions and methods: A new measurement of scientific novelty," Journal of Informetrics, Elsevier, vol. 16(2).
    17. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    18. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    19. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).
    20. Ke, Qing, 2020. "Technological impact of biomedical research: The role of basicness and novelty," Research Policy, Elsevier, vol. 49(7).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:16:y:2022:i:3:s175115772200058x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.