Printed from https://ideas.repec.org/a/spr/scient/v128y2023i7d10.1007_s11192-023-04734-1.html

Does citation polarity help evaluate the quality of academic papers?

Author

Listed:
  • Linhong Xu

    (Dalian University of Technology; Software Institute, Dalian University of Foreign Languages)

  • Kun Ding

    (Dalian University of Technology)

  • Yuan Lin

    (Dalian University of Technology)

  • Chunbo Zhang

    (Dalian University of Technology)

Abstract

Citation frequency is an important metric for evaluating academic papers, but it treats all citations as equally valuable. This study examines whether citation polarity, which carries evaluative information such as criticism or praise, is valid for assessing paper quality. A total of 3538 citation sentences referring to papers from ACL conferences were selected and manually annotated for citation polarity. The cited papers were divided into a best-paper group and a matching-paper group, and the matched pairs were tested to determine whether the two groups differed in positive and negative citations, and to further investigate how citation polarity changed as the citation window widened. The results showed that the best papers and the matching papers differed significantly in the numbers of both positive and negative citations: the mean and median numbers of positive citations in the best group were about 1.5 times those in the matching group. As the citation window increased, the best papers maintained their dominance in both positive and negative citations over five years, and the citation peak of the best group was about three times that of the matching group. The metric of citation polarity can therefore help evaluate the quality of papers and provides a new, more objective perspective for the evaluation of academic papers.
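The group comparison described in the abstract (summarizing annotated positive-citation counts per paper and comparing the best-paper group against the matching group) can be illustrated with a minimal sketch. The counts below are hypothetical illustrative data, not the study's annotations, and `polarity_summary` is an assumed helper, not the authors' code.

```python
from statistics import mean, median

def polarity_summary(pos_counts):
    """Summarize positive-citation counts for a group of papers."""
    return {"mean": mean(pos_counts), "median": median(pos_counts)}

# Hypothetical counts of positive citations per paper in each group
# (illustrative only, not the study's data).
best_group = [12, 9, 15, 8, 11]
matching_group = [7, 6, 9, 5, 8]

best = polarity_summary(best_group)
match = polarity_summary(matching_group)

# Ratio of group means, analogous to the roughly 1.5x gap the abstract reports.
mean_ratio = best["mean"] / match["mean"]
print(f"best mean={best['mean']:.1f}, matching mean={match['mean']:.1f}, "
      f"ratio={mean_ratio:.2f}")
```

In the study itself the difference between the paired groups is established with a statistical test rather than a raw ratio; this sketch only shows the shape of the per-group summary.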

Suggested Citation

  • Linhong Xu & Kun Ding & Yuan Lin & Chunbo Zhang, 2023. "Does citation polarity help evaluate the quality of academic papers?," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 4065-4087, July.
  • Handle: RePEc:spr:scient:v:128:y:2023:i:7:d:10.1007_s11192-023-04734-1
    DOI: 10.1007/s11192-023-04734-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-023-04734-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-023-04734-1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Terrence A. Brooks, 1986. "Evidence of complex citer motivations," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 37(1), pages 34-36, January.
    2. Lutz Bornmann, 2017. "Is collaboration among scientists related to the citation impact of papers because their quality increases with collaboration? An analysis based on data from F1000Prime and normalized citation scores," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(4), pages 1036-1047, April.
    3. Blaise Cronin & Lokman Meho, 2006. "Using the h‐index to rank influential information scientists," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 57(9), pages 1275-1278, July.
    4. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    5. Lina Zhou & Uchechukwuka Amadi & Dongsong Zhang, 2020. "Is Self-Citation Biased? An Investigation via the Lens of Citation Polarity, Density, and Location," Information Systems Frontiers, Springer, vol. 22(1), pages 77-90, February.
    6. Linhong Xu & Kun Ding & Yuan Lin, 2022. "Do negative citations reduce the impact of cited papers?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(2), pages 1161-1186, February.
    7. Negin Salimi, 2017. "Quality assessment of scientific outputs using the BWM," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(1), pages 195-213, July.
    8. Niu, Qikai & Zhou, Jianlin & Zeng, An & Fan, Ying & Di, Zengru, 2016. "Which publication is your representative work?," Journal of Informetrics, Elsevier, vol. 10(3), pages 842-853.
    9. Colin Macilwain, 2013. "Halt the avalanche of performance metrics," Nature, Nature, vol. 500(7462), pages 255-255, August.
    10. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    11. Frederique Bordignon, 2022. "Critical citations in knowledge construction and citation analysis: from paradox to definition," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(2), pages 959-972, February.
    12. Agnieszka Geras & Grzegorz Siudem & Marek Gagolewski, 2020. "Should we introduce a dislike button for academic articles?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 71(2), pages 221-229, February.
    13. Rudolf Farys & Tobias Wolbring, 2017. "Matched control groups for modeling events in citation data: An illustration of Nobel prize effects in citation networks," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(9), pages 2201-2210, September.
    14. Yuanyuan Liu & Qiang Wu & Shijie Wu & Yong Gao, 2021. "Weighted citation based on ranking-related contribution: a new index for evaluating article impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(10), pages 8653-8672, October.
    15. Hao Liao & Rui Xiao & Giulio Cimini & Matúš Medo, 2014. "Network-Driven Reputation in Online Scientific Communities," PLOS ONE, Public Library of Science, vol. 9(12), pages 1-18, December.
    16. Frederique Bordignon, 2020. "Self-correction of science: a comparative study of negative citations and post-publication peer review," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1225-1239, August.
    17. Dunaiski, Marcel & Visser, Willem & Geldenhuys, Jaco, 2016. "Evaluating paper and author ranking algorithms using impact and contribution awards," Journal of Informetrics, Elsevier, vol. 10(2), pages 392-407.
    18. Donald O. Case & Georgeann M. Higgins, 2000. "How can we investigate citation behavior? A study of reasons for citing literature in communication," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 51(7), pages 635-645.
    19. Jianlin Zhou & An Zeng & Ying Fan & Zengru Di, 2018. "The representative works of scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(3), pages 1721-1732, December.
    20. Peng Zhang & Peiling Wang & Qiang Wu, 2018. "How are the best JASIST papers cited?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(6), pages 857-860, June.
    21. Erjia Yan & Zheng Chen & Kai Li, 2020. "Authors' status and the perceived quality of their work: Measuring citation sentiment change in Nobel articles," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 71(3), pages 314-324, March.
    22. Muhammad Touseef Ikram & Muhammad Tanvir Afzal, 2019. "Aspect based citation sentiment analysis using linguistic patterns for better comprehension of scientific knowledge," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 73-95, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Linhong Xu & Kun Ding & Yuan Lin, 2022. "Do negative citations reduce the impact of cited papers?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(2), pages 1161-1186, February.
    2. Bornmann, Lutz & Leydesdorff, Loet, 2015. "Does quality and content matter for citedness? A comparison with para-textual factors and over time," Journal of Informetrics, Elsevier, vol. 9(3), pages 419-429.
    3. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    4. Dangzhi Zhao & Andreas Strotmann, 2020. "Telescopic and panoramic views of library and information science research 2011–2018: a comparison of four weighting schemes for author co-citation analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(1), pages 255-270, July.
    5. Wen Lou & Jiangen He & Lingxin Zhang & Zhijie Zhu & Yongjun Zhu, 2023. "Support behind the scenes: the relationship between acknowledgement, coauthor, and citation in Nobel articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5767-5790, October.
    6. Nigel Harwood, 2008. "Publication outlets and their effect on academic writers’ citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 77(2), pages 253-265, November.
    7. Shen, Hongquan & Xie, Juan & Ao, Weiyi & Cheng, Ying, 2022. "The continuity and citation impact of scientific collaboration with different gender composition," Journal of Informetrics, Elsevier, vol. 16(1).
    8. Shen, Hongquan & Cheng, Ying & Ju, Xiufang & Xie, Juan, 2022. "Rethinking the effect of inter-gender collaboration on research performance for scholars," Journal of Informetrics, Elsevier, vol. 16(4).
    9. Dongqing Lyu & Xuanmin Ruan & Juan Xie & Ying Cheng, 2021. "The classification of citing motivations: a meta-synthesis," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3243-3264, April.
    10. Heng Huang & Donghua Zhu & Xuefeng Wang, 2022. "Evaluating scientific impact of publications: combining citation polarity and purpose," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5257-5281, September.
    11. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    12. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    13. Binglu Wang & Yi Bu & Yang Xu, 2018. "A quantitative exploration on reasons for citing articles from the perspective of cited authors," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 675-687, August.
    14. Tanzila Ahmed & Ben Johnson & Charles Oppenheim & Catherine Peck, 2004. "Highly cited old papers and the reasons why they continue to be cited. Part II., The 1953 Watson and Crick article on the structure of DNA," Scientometrics, Springer;Akadémiai Kiadó, vol. 61(2), pages 147-156, October.
    15. Abramo, Giovanni, 2018. "Revisiting the scientometric conceptualization of impact and its measurement," Journal of Informetrics, Elsevier, vol. 12(3), pages 590-597.
    16. Bikun Chen & Dannan Deng & Zhouyan Zhong & Chengzhi Zhang, 2020. "Exploring linguistic characteristics of highly browsed and downloaded academic articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(3), pages 1769-1790, March.
    17. Lina Zhou & Uchechukwuka Amadi & Dongsong Zhang, 2020. "Is Self-Citation Biased? An Investigation via the Lens of Citation Polarity, Density, and Location," Information Systems Frontiers, Springer, vol. 22(1), pages 77-90, February.
    18. Dangzhi Zhao & Andreas Strotmann, 2020. "Deep and narrow impact: introducing location filtered citation counting," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 503-517, January.
    19. Pardeep Sud & Mike Thelwall, 2014. "Evaluating altmetrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1131-1143, February.
    20. Brady Lund & Amrollah Shamsi, 2023. "Examining the use of supportive and contrasting citations in different disciplines: a brief study using Scite (scite.ai) data," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4895-4900, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:128:y:2023:i:7:d:10.1007_s11192-023-04734-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.