
Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers

Author

Listed:
  • Hu, Zhigang
  • Tian, Wencan
  • Xu, Shenmeng
  • Zhang, Chunbo
  • Wang, Xianwen

Abstract

InCites Essential Science Indicators (ESI) is increasingly used to identify top-performing research and to evaluate the impact of institutions. Unfortunately, our study shows that ESI indicators, like other normalized citation indicators, suffer from the following pitfalls. First, the publication month affects a paper's probability of becoming a Highly Cited Paper (HCP): papers published in the earlier months of the year are more likely to accumulate enough citations to rank in the top 1% than those published later in the year. Second, papers with longer online-to-print delays have a clear advantage in being selected as HCPs. Third, research field normalization means that different fields have different citation thresholds for HCPs, making a journal's field classification consequential. Fourth, ESI applies uniform thresholds to both articles and reviews, which undermines the reliability of HCP selection because reviews, on average, attract more citations than articles. ESI's selection of HCPs thus provides an intuitive illustration of the problems of normalized citation impact indicators, such as those provided in InCites and SciVal.
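The selection mechanism the abstract critiques — flagging papers in the top 1% of citation counts within a field and publication year — can be sketched as follows. This is a minimal illustration of percentile-based HCP selection, not ESI's actual implementation; the field names, citation counts, and the `hcp_threshold` helper are hypothetical.

```python
# Sketch of top-1% "Highly Cited Paper" selection within a field/year
# cell. Hypothetical illustration only, not ESI's actual algorithm or data.
import math

def hcp_threshold(citation_counts, top_fraction=0.01):
    """Minimum citation count needed to rank in the top `top_fraction`
    of a field/year cell (ties at the threshold would all qualify)."""
    ranked = sorted(citation_counts, reverse=True)
    n_top = max(1, math.ceil(len(ranked) * top_fraction))
    return ranked[n_top - 1]

# Two hypothetical field/year cells with different citation distributions,
# showing why the HCP threshold differs by field (the third pitfall):
cells = {
    ("Clinical Medicine", 2016): [5 * i for i in range(200)],  # 0, 5, ..., 995
    ("Mathematics", 2016): list(range(200)),                   # 0, 1, ..., 199
}

for (field, year), counts in cells.items():
    t = hcp_threshold(counts)
    print(f"{field} {year}: top-1% threshold = {t} citations")
```

Because the cutoff is recomputed per field/year cell, the same citation count can clear the bar in one field and miss it in another; and because citations accrue over time, a January paper competes in the same cell as a December paper despite an eleven-month head start (the first pitfall).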

Suggested Citation

  • Hu, Zhigang & Tian, Wencan & Xu, Shenmeng & Zhang, Chunbo & Wang, Xianwen, 2018. "Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers," Journal of Informetrics, Elsevier, vol. 12(4), pages 1133-1145.
  • Handle: RePEc:eee:infome:v:12:y:2018:i:4:p:1133-1145
    DOI: 10.1016/j.joi.2018.09.006

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157718301147
    Download Restriction: Full text for ScienceDirect subscribers only

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Herranz, Neus & Ruiz-Castillo, Javier, 2012. "Sub-field normalization in the multiplicative case: Average-based citation indicators," Journal of Informetrics, Elsevier, vol. 6(4), pages 543-556.
    2. Li, Yunrong & Ruiz-Castillo, Javier, 2013. "The comparison of normalization procedures based on different classification systems," Journal of Informetrics, Elsevier, vol. 7(4), pages 945-958.
    3. Moed, Henk F., 2010. "Measuring contextual citation impact of scientific journals," Journal of Informetrics, Elsevier, vol. 4(3), pages 265-277.
    4. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    5. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    6. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication‐level classification system of science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    7. Loet Leydesdorff & Lutz Bornmann, 2016. "The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(3), pages 707-714, March.
    8. Anne-Wil Harzing, 2015. "Health warning: might contain multiple personalities—the problem of homonyms in Thomson Reuters Essential Science Indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 105(3), pages 2259-2270, December.
    9. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S. & van Raan, Anthony F.J., 2011. "Towards a new crown indicator: Some theoretical considerations," Journal of Informetrics, Elsevier, vol. 5(1), pages 37-47.
    10. Opthof, Tobias & Leydesdorff, Loet, 2010. "Caveats for the journal and field normalizations in the CWTS (“Leiden”) evaluations of research performance," Journal of Informetrics, Elsevier, vol. 4(3), pages 423-430.
    11. Donner, Paul, 2018. "Effect of publication month on citation impact," Journal of Informetrics, Elsevier, vol. 12(1), pages 330-343.
    12. Ruimin Ma & Chaoqun Ni & Junping Qiu, 2008. "Scientific research competitiveness of world universities in computer science," Scientometrics, Springer;Akadémiai Kiadó, vol. 76(2), pages 245-260, August.
    13. Didegah, Fereshteh & Thelwall, Mike, 2013. "Which factors help authors produce the highest impact research? Collaboration, journal and document properties," Journal of Informetrics, Elsevier, vol. 7(4), pages 861-873.
    14. Hui-Zhen Fu & Kun-Yang Chuang & Ming-Huang Wang & Yuh-Shan Ho, 2011. "Characteristics of research in China assessed with Essential Science Indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(3), pages 841-862, September.
    15. Thelwall, Mike & Fairclough, Ruth, 2015. "The influence of time and discipline on the magnitude of correlations between citation counts and quality scores," Journal of Informetrics, Elsevier, vol. 9(3), pages 529-541.
    16. Jian Wang, 2013. "Citation time window choice for research impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 851-872, March.
    17. Edit Csajbók & Anna Berhidi & Lívia Vasas & András Schubert, 2007. "Hirsch-index for countries based on Essential Science Indicators data," Scientometrics, Springer;Akadémiai Kiadó, vol. 73(1), pages 91-117, October.
    18. Colliander, Cristian & Ahlgren, Per, 2011. "The effects and their stability of field normalization baseline on relative performance with respect to citation impact: A case study of 20 natural science departments," Journal of Informetrics, Elsevier, vol. 5(1), pages 101-113.
    19. Thelwall, Mike & Wilson, Paul, 2014. "Distributions for cited articles from individual subjects and years," Journal of Informetrics, Elsevier, vol. 8(4), pages 824-839.
    20. Guang Yu & Xiao-Hong Wang & Da-Ren Yu, 2005. "The influence of publication delays on impact factors," Scientometrics, Springer;Akadémiai Kiadó, vol. 64(2), pages 235-246, August.
    21. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Jianhua Hou & Da Ma, 2020. "How the high-impact papers formed? A study using data from social media and citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2597-2615, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    3. Liwei Cai & Jiahao Tian & Jiaying Liu & Xiaomei Bai & Ivan Lee & Xiangjie Kong & Feng Xia, 2019. "Scholarly impact assessment: a survey of citation weighting solutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 453-478, February.
    4. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    5. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    6. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    7. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    8. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    9. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    10. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    11. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    12. Perianes-Rodriguez, Antonio & Ruiz-Castillo, Javier, 2017. "A comparison of the Web of Science and publication-level classification systems of science," Journal of Informetrics, Elsevier, vol. 11(1), pages 32-45.
    13. Cristian Colliander & Per Ahlgren, 2019. "Comparison of publication-level approaches to ex-post citation normalization," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 283-300, July.
    14. Tolga Yuret, 2018. "Author-weighted impact factor and reference return ratio: can we attain more equality among fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2097-2111, September.
    15. Wang, Xing & Zhang, Zhihui, 2020. "Improving the reliability of short-term citation impact indicators by taking into account the correlation between short- and long-term citation impact," Journal of Informetrics, Elsevier, vol. 14(2).
    16. Antonio Perianes-Rodriguez & Javier Ruiz-Castillo, 2016. "A comparison of two ways of evaluating research units working in different scientific fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 539-561, February.
    17. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    18. Li, Yunrong & Radicchi, Filippo & Castellano, Claudio & Ruiz-Castillo, Javier, 2013. "Quantitative evaluation of alternative field normalization procedures," Journal of Informetrics, Elsevier, vol. 7(3), pages 746-755.
    19. Larivière, Vincent & Gingras, Yves, 2011. "Averages of ratios vs. ratios of averages: An empirical analysis of four levels of aggregation," Journal of Informetrics, Elsevier, vol. 5(3), pages 392-399.
    20. Carusi, Chiara & Bianchi, Giuseppe, 2019. "Scientific community detection via bipartite scholar/journal graph co-clustering," Journal of Informetrics, Elsevier, vol. 13(1), pages 354-386.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:12:y:2018:i:4:p:1133-1145. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Nithya Sathishkumar. General contact details of provider: http://www.elsevier.com/locate/joi.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.