Printed from https://ideas.repec.org/a/spr/scient/v127y2022i5d10.1007_s11192-022-04350-5.html

Measuring the innovation of method knowledge elements in scientific literature

Authors

Listed:

  • Zhongyi Wang (Central China Normal University)
  • Keying Wang (Central China Normal University)
  • Jiyue Liu (Central China Normal University)
  • Jing Huang (Wuhan Polytechnic)
  • Haihua Chen (University of North Texas)

Abstract

Interest in assessing research impact is increasing due to its importance for informing actions and funding-allocation decisions. The level of innovation (also called the "innovation degree" in this article), one of the most essential factors affecting the impact of scientific literature, has also received increasing attention. However, current studies mainly focus on the overall innovation degree of scientific literature at the macro level, while ignoring the innovation degree of a specific knowledge element (KE), such as the method knowledge element (MKE). A macro-level view makes it difficult to identify which part of the scientific literature contains the innovations. To bridge this gap, a more fine-grained evaluation of academic papers is urgently needed. Fine-grained evaluation can help ensure the quality of a paper before it is published and identify useful knowledge content in a paper for academic users. Different KEs can be used to perform fine-grained evaluation; however, MKEs are usually considered among the most essential of all KEs. Therefore, this study proposes a framework to measure the innovation degree of method knowledge elements (MIDMKE) in scientific literature. In this framework, we first extract the MKEs using a rule-based approach and generate a cloud drop for each MKE using the biterm topic model (BTM). The generated cloud drops are then used to create a method knowledge cloud (MKC) for each MKE. Finally, we calculate the innovation score of an MKE based on the similarity between it and other MKEs of its type. Our empirical study on a China National Knowledge Infrastructure (CNKI) academic literature dataset shows that the proposed approach can measure the innovation of MKEs in scientific literature effectively. The proposed method is useful for both reviewers and funding agencies in assessing the quality of academic papers. The dataset, the code implementing the algorithms, and the complete experimental results will be released at: https://github.com/haihua0913/midmke .
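The abstract describes the scoring step only at a high level. The Python sketch below is not the authors' released implementation; it assumes each MKE has already been reduced by BTM to a topic-probability vector (its "cloud drop"), and it scores innovation as one minus the maximum cosine similarity to other MKEs of the same type. The aggregation into a full method knowledge cloud and the exact scoring formula are simplified assumptions here.

    # Minimal sketch of similarity-based innovation scoring for MKEs.
    # Assumption: each MKE is a topic-probability vector from a topic
    # model such as BTM; this is an illustration, not the MIDMKE code.
    import numpy as np

    def cosine(u: np.ndarray, v: np.ndarray) -> float:
        """Cosine similarity between two topic-probability vectors."""
        denom = np.linalg.norm(u) * np.linalg.norm(v)
        return float(u @ v / denom) if denom else 0.0

    def innovation_score(target: np.ndarray, peers: list[np.ndarray]) -> float:
        """1 = unlike any same-type peer MKE; 0 = identical to one of them."""
        if not peers:
            return 1.0  # nothing comparable: maximally novel by convention
        return 1.0 - max(cosine(target, p) for p in peers)

    if __name__ == "__main__":
        # Toy "cloud drops" over a 4-topic model (hypothetical values).
        mke_a = np.array([0.70, 0.10, 0.10, 0.10])
        mke_b = np.array([0.65, 0.15, 0.10, 0.10])  # close to A
        mke_c = np.array([0.05, 0.05, 0.10, 0.80])  # far from both
        print(innovation_score(mke_a, [mke_b, mke_c]))  # low: B is very similar
        print(innovation_score(mke_c, [mke_a, mke_b]))  # high: unlike either peer

Under these assumptions, an MKE that closely resembles an existing method of its type receives a score near zero, while one with no close neighbor scores near one, matching the intuition in the abstract.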

Suggested Citation

  • Zhongyi Wang & Keying Wang & Jiyue Liu & Jing Huang & Haihua Chen, 2022. "Measuring the innovation of method knowledge elements in scientific literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2803-2827, May.
  • Handle: RePEc:spr:scient:v:127:y:2022:i:5:d:10.1007_s11192-022-04350-5
    DOI: 10.1007/s11192-022-04350-5

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-022-04350-5
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-022-04350-5?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Mingyang Wang & Jiaqi Zhang & Shijia Jiao & Xiangrong Zhang & Na Zhu & Guangsheng Chen, 2020. "Important citation identification by exploiting the syntactic and contextual information of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2109-2129, December.
    2. Yujia Zhai & Ying Ding & Fang Wang, 2018. "Measuring the diffusion of an innovation: A citation analysis," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(3), pages 368-379, March.
    3. Trapido, Denis, 2015. "How novelty in knowledge earns recognition: The role of consistent identities," Research Policy, Elsevier, vol. 44(8), pages 1488-1500.
    4. Uddin, Shahadat & Khan, Arif, 2016. "The impact of author-selected keywords on citation counts," Journal of Informetrics, Elsevier, vol. 10(4), pages 1166-1177.
    5. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    6. Ma, Li-Ching & Li, Han-Lin, 2008. "A fuzzy ranking method with range reduction techniques," European Journal of Operational Research, Elsevier, vol. 184(3), pages 1032-1043, February.
    7. David Colquhoun, 2003. "Challenging the tyranny of impact factors," Nature, Nature, vol. 423(6939), pages 479-479, May.
    8. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    9. ., 2008. "The Method of Analysis," Chapters, in: Law, Informal Rules and Economic Performance, chapter 10, Edward Elgar Publishing.
    10. Andrén, Daniela & Andrén, Thomas, 2008. "Part-Time Sick Leave as a Treatment Method?," Working Papers in Economics 320, University of Gothenburg, Department of Economics.
    11. Narongrit Sombatsompop & Apisit Kositchaiyong & Teerasak Markpin & Sekson Inrit, 2006. "Scientific evaluations of citation quality of international research articles in the SCI database: Thailand case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 66(3), pages 521-535, March.
    12. Mikko Packalen & Jay Bhattacharya, 2019. "Age and the Trying Out of New Ideas," Journal of Human Capital, University of Chicago Press, vol. 13(2), pages 341-373.
    13. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    14. O. Mryglod & R. Kenna & Yu. Holovatch & B. Berche, 2013. "Absolute and specific measures of research group excellence," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(1), pages 115-127, April.
    15. Yan, Erjia, 2014. "Research dynamics: Measuring the continuity and popularity of research topics," Journal of Informetrics, Elsevier, vol. 8(1), pages 98-110.
    16. Lutz Bornmann & Gerlind Wallon & Anna Ledin, 2008. "Is the h index related to (standard) bibliometric measures and to the assessments by peers? An investigation of the h index by using molecular life sciences data," Research Evaluation, Oxford University Press, vol. 17(2), pages 149-156, June.
    17. Hulya Behret & Cigdem Altin Gumussoy, 2012. "A fuzzy integrated approach for the selection of academic papers to a special issue," International Journal of Applied Management Science, Inderscience Enterprises Ltd, vol. 4(4), pages 371-384.
    18. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Di Costa, 2011. "National research assessment exercises: a comparison of peer review and bibliometrics rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 929-941, December.
    19. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    20. Kosmulski, Marek, 2011. "Successful papers: A new idea in evaluation of scientific output," Journal of Informetrics, Elsevier, vol. 5(3), pages 481-485.
    21. Natsuo Onodera & Fuyuki Yoshikane, 2015. "Factors affecting citation rates of research articles," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(4), pages 739-764, April.
    22. Martin Reinhart, 2009. "Peer review of grant applications in biology and medicine. Reliability, fairness, and validity," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 789-809, December.
    23. Iman Tahamtan & Askar Safipour Afshar & Khadijeh Ahamdzadeh, 2016. "Factors affecting number of citations: a comprehensive review of the literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1195-1225, June.
    24. Liwei Cai & Jiahao Tian & Jiaying Liu & Xiaomei Bai & Ivan Lee & Xiangjie Kong & Feng Xia, 2019. "Scholarly impact assessment: a survey of citation weighting solutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 453-478, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Jiang, Zhuoren & Lin, Tianqianjin & Huang, Cui, 2023. "Deep representation learning of scientific paper reveals its potential scholarly impact," Journal of Informetrics, Elsevier, vol. 17(1).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ruan, Xuanmin & Lyu, Dongqing & Gong, Kaile & Cheng, Ying & Li, Jiang, 2021. "Rethinking the disruption index as a measure of scientific and technological advances," Technological Forecasting and Social Change, Elsevier, vol. 172(C).
    2. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    3. Guoqiang Liang & Haiyan Hou & Qiao Chen & Zhigang Hu, 2020. "Diffusion and adoption: an explanatory model of “question mark” and “rising star” articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(1), pages 219-232, July.
    4. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    5. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    6. António Osório & Lutz Bornmann, 2021. "On the disruptive power of small-teams research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 117-133, January.
    7. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    8. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    9. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    10. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).
    11. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Nov 2023.
    12. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    13. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    14. Hassan Danaeefard, 2022. "Implication studies: a methodological framework," Quality & Quantity: International Journal of Methodology, Springer, vol. 56(5), pages 3159-3188, October.
    15. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    16. Shiyun Wang & Yaxue Ma & Jin Mao & Yun Bai & Zhentao Liang & Gang Li, 2023. "Quantifying scientific breakthroughs by a novel disruption indicator based on knowledge entities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 150-167, February.
    17. Liu, Jialin & Chen, Hongkan & Liu, Zhibo & Bu, Yi & Gu, Weiye, 2022. "Non-linearity between referencing behavior and citation impact: A large-scale, discipline-level analysis," Journal of Informetrics, Elsevier, vol. 16(3).
    18. Min, Chao & Bu, Yi & Sun, Jianjun, 2021. "Predicting scientific breakthroughs based on knowledge structure variations," Technological Forecasting and Social Change, Elsevier, vol. 164(C).
    19. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    20. , Aisdl, 2020. "Sustainability model of Vietnamese women entrepreneurship," OSF Preprints kjmdr, Center for Open Science.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:127:y:2022:i:5:d:10.1007_s11192-022-04350-5. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.