
Using the full-text content of academic articles to identify and evaluate algorithm entities in the domain of natural language processing

Authors
  • Wang, Yuzhuo
  • Zhang, Chengzhi

Abstract

In the era of big data, the advancement, improvement, and application of algorithms in academic research have played an important role in driving the development of many disciplines. Academic papers across disciplines, especially in computer science, contain a large number of algorithms. Identifying algorithms in the full-text content of papers makes it possible to determine the popular or classical algorithms in a specific field and helps scholars gain a comprehensive understanding of those algorithms, and even of the field itself. To this end, this article takes the field of natural language processing (NLP) as an example and identifies algorithms in academic papers from that field. A dictionary of algorithms is constructed by manually annotating the contents of papers, and sentences mentioning algorithms in the dictionary are extracted through dictionary-based matching. The number of articles mentioning an algorithm is used as an indicator of that algorithm's influence. Our results reveal the most influential algorithm in NLP papers and show that classification algorithms account for the largest proportion of high-impact algorithms. In addition, the evolution of algorithm influence reflects changes in the field's research tasks and topics, and the influence of different algorithms follows different trends over time. As a preliminary exploration, this paper analyzes the impact of algorithms mentioned in academic text, and its results can serve as training data for large-scale automatic extraction of algorithms in the future. The methodology is domain-independent and can be applied to other fields.
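The pipeline the abstract describes can be sketched as follows: split each paper into sentences, match entries from a manually built algorithm dictionary against each sentence, and count how many distinct papers mention each algorithm as the influence indicator. This is a minimal illustrative sketch; the dictionary entries and function names are assumptions, not taken from the paper.

```python
import re
from collections import defaultdict

# Hypothetical mini-dictionary of algorithm names; the paper's actual
# dictionary was built by manual annotation of NLP papers.
ALGORITHM_DICT = ["CRF", "LSTM", "SVM", "Naive Bayes"]

def find_algorithm_sentences(text, dictionary):
    """Return {algorithm: [sentences mentioning it]} for one paper's text."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    hits = defaultdict(list)
    for sent in sentences:
        for algo in dictionary:
            # Word-boundary match so "SVM" is not found inside e.g. "SVMs-like" tokens.
            if re.search(r"\b" + re.escape(algo) + r"\b", sent):
                hits[algo].append(sent)
    return dict(hits)

def article_counts(papers, dictionary):
    """Number of papers mentioning each algorithm (the influence indicator)."""
    counts = defaultdict(int)
    for text in papers:
        for algo in find_algorithm_sentences(text, dictionary):
            counts[algo] += 1  # count each paper once per algorithm
    return dict(counts)
```

A paper mentioning an algorithm several times still contributes one to that algorithm's count, matching the article-level indicator described in the abstract.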

Suggested Citation

  • Wang, Yuzhuo & Zhang, Chengzhi, 2020. "Using the full-text content of academic articles to identify and evaluate algorithm entities in the domain of natural language processing," Journal of Informetrics, Elsevier, vol. 14(4).
  • Handle: RePEc:eee:infome:v:14:y:2020:i:4:s1751157720300985
    DOI: 10.1016/j.joi.2020.101091

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157720300985
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2020.101091?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Pan, Xuelian & Yan, Erjia & Cui, Ming & Hua, Weina, 2019. "How important is software to library and information science research? A content analysis of full-text publications," Journal of Informetrics, Elsevier, vol. 13(1), pages 397-406.
    2. Bo Yang & Ronald Rousseau & Xue Wang & Shuiqing Huang, 2018. "How important is scientific software in bioinformatics research? A comparative study between international and Chinese research communities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(9), pages 1122-1133, September.
    3. Hui-Zhen Fu & Yuh-Shan Ho, 2013. "Comparison of independent research of China’s top universities using bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(1), pages 259-276, July.
    4. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Di Costa, 2011. "National research assessment exercises: a comparison of peer review and bibliometrics rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 929-941, December.
    5. Pan, Xuelian & Yan, Erjia & Wang, Qianqian & Hua, Weina, 2015. "Assessing the impact of software on science: A bootstrapped learning of software entities in full-text papers," Journal of Informetrics, Elsevier, vol. 9(4), pages 860-871.
    6. Ricardo Cartes-Velásquez & Carlos Manterola Delgado, 2014. "Bibliometric analysis of articles published in ISI dental journals, 2007–2011," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(3), pages 2223-2233, March.
    7. Mengnan Zhao & Erjia Yan & Kai Li, 2018. "Data set mentions and citations: A content analysis of full-text publications," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(1), pages 32-46, January.
    8. Lutz Bornmann & Rüdiger Mutz, 2015. "Growth rates of modern science: A bibliometric analysis based on the number of publications and cited references," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2215-2222, November.
    9. James Howison & Julia Bullard, 2016. "Software in the scientific literature: Problems with seeing, finding, and using software mentioned in the biology literature," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(9), pages 2137-2155, September.
    10. Rongying Zhao & Mingkun Wei, 2017. "Impact evaluation of open source software: an Altmetrics perspective," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(2), pages 1017-1033, February.
    11. Li, Kai & Yan, Erjia & Feng, Yuanyuan, 2017. "How is R cited in research outputs? Structure, impacts, and citation standard," Journal of Informetrics, Elsevier, vol. 11(4), pages 989-1002.
    12. Xuelian Pan & Erjia Yan & Weina Hua, 2016. "Disciplinary differences of software use and impact in scientific literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1593-1610, December.
    13. Li, Kai & Yan, Erjia, 2018. "Co-mention network of R packages: Scientific impact and clustering structure," Journal of Informetrics, Elsevier, vol. 12(1), pages 87-100.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Masood, Muhammad Ali & Abbasi, Rabeeh Ayaz, 2021. "Using graph embedding and machine learning to identify rebels on twitter," Journal of Informetrics, Elsevier, vol. 15(1).
    2. Yuzhuo Wang & Chengzhi Zhang & Kai Li, 2022. "A review on method entities in the academic literature: extraction, evaluation, and application," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2479-2520, May.
    3. Xiaorui Jiang & Jingqiang Chen, 2023. "Contextualised segment-wise citation function classification," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(9), pages 5117-5158, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pan, Xuelian & Yan, Erjia & Cui, Ming & Hua, Weina, 2018. "Examining the usage, citation, and diffusion patterns of bibliometric mapping software: A comparative study of three tools," Journal of Informetrics, Elsevier, vol. 12(2), pages 481-493.
    2. Enrique Orduña-Malea & Rodrigo Costas, 2021. "Link-based approach to study scientific software usage: the case of VOSviewer," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 8153-8186, September.
    3. Yuzhuo Wang & Chengzhi Zhang & Kai Li, 2022. "A review on method entities in the academic literature: extraction, evaluation, and application," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2479-2520, May.
    4. Pan, Xuelian & Yan, Erjia & Cui, Ming & Hua, Weina, 2019. "How important is software to library and information science research? A content analysis of full-text publications," Journal of Informetrics, Elsevier, vol. 13(1), pages 397-406.
    5. Li, Kai & Chen, Pei-Ying & Yan, Erjia, 2019. "Challenges of measuring software impact through citations: An examination of the lme4 R package," Journal of Informetrics, Elsevier, vol. 13(1), pages 449-461.
    6. Lu Jiang & Xinyu Kang & Shan Huang & Bo Yang, 2022. "A refinement strategy for identification of scientific software from bioinformatics publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3293-3316, June.
    7. Robert Tomaszewski, 2023. "Visibility, impact, and applications of bibliometric software tools through citation analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 4007-4028, July.
    8. Li, Kai & Yan, Erjia, 2018. "Co-mention network of R packages: Scientific impact and clustering structure," Journal of Informetrics, Elsevier, vol. 12(1), pages 87-100.
    9. Alsudais, Abdulkareem, 2021. "In-code citation practices in open research software libraries," Journal of Informetrics, Elsevier, vol. 15(2).
    10. Bikun Chen & Dannan Deng & Zhouyan Zhong & Chengzhi Zhang, 2020. "Exploring linguistic characteristics of highly browsed and downloaded academic articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(3), pages 1769-1790, March.
    11. Caifan Du & Johanna Cohoon & Patrice Lopez & James Howison, 2021. "Softcite dataset: A dataset of software mentions in biomedical and economic research publications," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 72(7), pages 870-884, July.
    12. Xuelian Pan & Erjia Yan & Weina Hua, 2016. "Disciplinary differences of software use and impact in scientific literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1593-1610, December.
    13. Ramona Weinrich, 2019. "Opportunities for the Adoption of Health-Based Sustainable Dietary Patterns: A Review on Consumer Research of Meat Substitutes," Sustainability, MDPI, vol. 11(15), pages 1-15, July.
    14. Piers Steel & Sjoerd Beugelsdijk & Herman Aguinis, 2021. "The anatomy of an award-winning meta-analysis: Recommendations for authors, reviewers, and readers of meta-analytic reviews," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 52(1), pages 23-44, February.
    15. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    16. Augusteijn, Hilde Elisabeth Maria & van Aert, Robbie Cornelis Maria & van Assen, Marcel A. L. M., 2021. "Posterior Probabilities of Effect Sizes and Heterogeneity in Meta-Analysis: An Intuitive Approach of Dealing with Publication Bias," OSF Preprints avkgj, Center for Open Science.
    17. Ruhua Huang & Yuting Huang & Fan Qi & Leyi Shi & Baiyang Li & Wei Yu, 2022. "Exploring the characteristics of special issues: distribution, topicality, and citation impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5233-5256, September.
    18. Neal R. Haddaway & Max W. Callaghan & Alexandra M. Collins & William F. Lamb & Jan C. Minx & James Thomas & Denny John, 2020. "On the use of computer‐assistance to facilitate systematic mapping," Campbell Systematic Reviews, John Wiley & Sons, vol. 16(4), December.
    19. Vincent Raoult, 2020. "How Many Papers Should Scientists Be Reviewing? An Analysis Using Verified Peer Review Reports," Publications, MDPI, vol. 8(1), pages 1-9, January.
    20. Eloy López-Meneses & Esteban Vázquez-Cano & Mariana-Daniela González-Zamar & Emilio Abad-Segura, 2020. "Socioeconomic Effects in Cyberbullying: Global Research Trends in the Educational Context," IJERPH, MDPI, vol. 17(12), pages 1-31, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:14:y:2020:i:4:s1751157720300985. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.