
Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers

Author

Listed:
  • Hu, Zhigang
  • Tian, Wencan
  • Xu, Shenmeng
  • Zhang, Chunbo
  • Wang, Xianwen

Abstract

InCites Essential Science Indicators (ESI) is increasingly used to identify top-performing research and to evaluate the impact of institutions. Unfortunately, our study shows that ESI indicators, like other normalized citation indicators, suffer from the following pitfalls. The first two concern timing: the publication month and the online-to-print delay both affect a paper’s probability of becoming a Highly Cited Paper (HCP). Papers published in the earlier months of a year are more likely to accumulate enough citations to rank in the top 1% than papers published later in the year, and papers with longer online-to-print delays have a clear advantage in being selected as HCPs. Research field normalization leads to the third pitfall: different research fields have different citation thresholds for HCPs, so the field to which a journal is assigned matters. The fourth pitfall is the uniform threshold applied to both articles and reviews, which undermines the reliability of HCP selection because reviews, on average, attract more citations than articles. ESI’s selection of HCPs thus gives an intuitive feel for the problems of normalized citation impact indicators, such as those provided in InCites and SciVal.
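The mechanism behind the article-versus-review pitfall is easy to illustrate. The Python sketch below uses entirely made-up citation distributions (the negative-binomial parameters and sample sizes are assumptions, not ESI data) to compute an ESI-style top-1% threshold for a pooled set of articles and reviews and to compare how often each document type clears it. It is illustrative only and does not reproduce ESI's actual selection procedure.

    import numpy as np

    def hcp_threshold(citations, top_share=0.01):
        """Citation count needed to rank in the top `top_share` of a set of papers."""
        # The (1 - top_share) quantile of the citation distribution is the entry bar.
        return np.quantile(citations, 1 - top_share)

    rng = np.random.default_rng(0)

    # Hypothetical citation counts for one field and publication year.
    # Reviews are simulated with a higher mean than articles, mirroring the
    # paper's point that reviews tend to attract more citations.
    articles = rng.negative_binomial(n=2, p=0.08, size=9000)   # made-up parameters
    reviews = rng.negative_binomial(n=2, p=0.04, size=1000)

    pooled = np.concatenate([articles, reviews])

    print("Pooled top-1% threshold:", hcp_threshold(pooled))
    print("Article-only threshold: ", hcp_threshold(articles))
    print("Review-only threshold:  ", hcp_threshold(reviews))

    # With a single pooled threshold, reviews take a disproportionate share
    # of the HCP slots, the distortion described in the abstract.
    share_reviews = np.mean(reviews >= hcp_threshold(pooled))
    share_articles = np.mean(articles >= hcp_threshold(pooled))
    print(f"Share of reviews selected:  {share_reviews:.2%}")
    print(f"Share of articles selected: {share_articles:.2%}")

Because the simulated reviews sit in the heavier tail of the pooled distribution, they clear the shared threshold far more often than articles; separate thresholds per document type, or per field, shift the entry bar accordingly.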

Suggested Citation

  • Hu, Zhigang & Tian, Wencan & Xu, Shenmeng & Zhang, Chunbo & Wang, Xianwen, 2018. "Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers," Journal of Informetrics, Elsevier, vol. 12(4), pages 1133-1145.
  • Handle: RePEc:eee:infome:v:12:y:2018:i:4:p:1133-1145
    DOI: 10.1016/j.joi.2018.09.006

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157718301147
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2018.09.006?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where your library subscription gives access to this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Herranz, Neus & Ruiz-Castillo, Javier, 2012. "Sub-field normalization in the multiplicative case: Average-based citation indicators," Journal of Informetrics, Elsevier, vol. 6(4), pages 543-556.
    2. Li, Yunrong & Ruiz-Castillo, Javier, 2013. "The comparison of normalization procedures based on different classification systems," Journal of Informetrics, Elsevier, vol. 7(4), pages 945-958.
    3. Anne-Wil Harzing, 2015. "Health warning: might contain multiple personalities—the problem of homonyms in Thomson Reuters Essential Science Indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 105(3), pages 2259-2270, December.
    4. Ruiz-Castillo, Javier, 2013. "The comparison of classification-system-based normalization procedures with source normalization alternatives in Waltman and Van Eck (2013)," UC3M Working papers. Economics we1318, Universidad Carlos III de Madrid. Departamento de Economía.
    5. Kun-Yang Chuang & Ming-Huang Wang & Yuh-Shan Ho, 2011. "High-impact papers presented in the subject category of water resources in the essential science indicators database of the institute for scientific information," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 551-562, June.
    6. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    7. Didegah, Fereshteh & Thelwall, Mike, 2013. "Which factors help authors produce the highest impact research? Collaboration, journal and document properties," Journal of Informetrics, Elsevier, vol. 7(4), pages 861-873.
    8. Thelwall, Mike & Fairclough, Ruth, 2015. "The influence of time and discipline on the magnitude of correlations between citation counts and quality scores," Journal of Informetrics, Elsevier, vol. 9(3), pages 529-541.
    9. Thelwall, Mike & Wilson, Paul, 2014. "Distributions for cited articles from individual subjects and years," Journal of Informetrics, Elsevier, vol. 8(4), pages 824-839.
    10. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    11. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S. & van Raan, Anthony F.J., 2011. "Towards a new crown indicator: Some theoretical considerations," Journal of Informetrics, Elsevier, vol. 5(1), pages 37-47.
    12. Jian Wang, 2013. "Citation time window choice for research impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 851-872, March.
    13. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    14. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication‐level classification system of science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    15. Donner, Paul, 2018. "Effect of publication month on citation impact," Journal of Informetrics, Elsevier, vol. 12(1), pages 330-343.
    16. Ruimin Ma & Chaoqun Ni & Junping Qiu, 2008. "Scientific research competitiveness of world universities in computer science," Scientometrics, Springer;Akadémiai Kiadó, vol. 76(2), pages 245-260, August.
    17. Moed, Henk F., 2010. "Measuring contextual citation impact of scientific journals," Journal of Informetrics, Elsevier, vol. 4(3), pages 265-277.
    18. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    19. Loet Leydesdorff & Lutz Bornmann, 2016. "The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(3), pages 707-714, March.
    20. Opthof, Tobias & Leydesdorff, Loet, 2010. "Caveats for the journal and field normalizations in the CWTS (“Leiden”) evaluations of research performance," Journal of Informetrics, Elsevier, vol. 4(3), pages 423-430.
    21. Hui-Zhen Fu & Kun-Yang Chuang & Ming-Huang Wang & Yuh-Shan Ho, 2011. "Characteristics of research in China assessed with Essential Science Indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(3), pages 841-862, September.
    22. Edit Csajbók & Anna Berhidi & Lívia Vasas & András Schubert, 2007. "Hirsch-index for countries based on Essential Science Indicators data," Scientometrics, Springer;Akadémiai Kiadó, vol. 73(1), pages 91-117, October.
    23. Colliander, Cristian & Ahlgren, Per, 2011. "The effects and their stability of field normalization baseline on relative performance with respect to citation impact: A case study of 20 natural science departments," Journal of Informetrics, Elsevier, vol. 5(1), pages 101-113.
    24. Guang Yu & Xiao-Hong Wang & Da-Ren Yu, 2005. "The influence of publication delays on impact factors," Scientometrics, Springer;Akadémiai Kiadó, vol. 64(2), pages 235-246, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Weishu Liu, 2021. "A matter of time: publication dates in Web of Science Core Collection," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 849-857, January.
    2. Assunta Di Vaio & Sohail Hasan & Rosa Palladino & Rohail Hassan, 2023. "The transition towards circular economy and waste within accounting and accountability models: a systematic literature review and conceptual framework," Environment, Development and Sustainability: A Multidisciplinary Approach to the Theory and Practice of Sustainable Development, Springer, vol. 25(1), pages 734-810, January.
    3. Sahar Mohamadi & Abbas Abbasi & Habib-Allah Ranaei Kordshouli & Kazem Askarifar, 2022. "Conceptualizing sustainable–responsible tourism indicators: an interpretive structural modeling approach," Environment, Development and Sustainability: A Multidisciplinary Approach to the Theory and Practice of Sustainable Development, Springer, vol. 24(1), pages 399-425, January.
    4. Junwen Zhu & Weishu Liu, 2020. "A tale of two databases: the use of Web of Science and Scopus in academic papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 321-335, April.
    5. Hu, Guangyuan & Ni, Rong & Tang, Li, 2022. "Do international nonstop flights foster influential research? Evidence from Sino-US scientific collaboration," Journal of Informetrics, Elsevier, vol. 16(4).
    6. Wang, Jingjing & Xu, Shuqi & Mariani, Manuel S. & Lü, Linyuan, 2021. "The local structure of citation networks uncovers expert-selected milestone papers," Journal of Informetrics, Elsevier, vol. 15(4).
    7. Tanja Mihalic & Sahar Mohamadi & Abbas Abbasi & Lóránt Dénes Dávid, 2021. "Mapping a Sustainable and Responsible Tourism Paradigm: A Bibliometric and Citation Network Analysis," Sustainability, MDPI, vol. 13(2), pages 1-22, January.
    8. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    9. Jianhua Hou & Da Ma, 2020. "How the high-impact papers formed? A study using data from social media and citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2597-2615, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    3. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    4. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    5. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    6. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    7. Liwei Cai & Jiahao Tian & Jiaying Liu & Xiaomei Bai & Ivan Lee & Xiangjie Kong & Feng Xia, 2019. "Scholarly impact assessment: a survey of citation weighting solutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 453-478, February.
    8. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    9. Perianes-Rodriguez, Antonio & Ruiz-Castillo, Javier, 2017. "A comparison of the Web of Science and publication-level classification systems of science," Journal of Informetrics, Elsevier, vol. 11(1), pages 32-45.
    10. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    11. Ruiz-Castillo, Javier & Costas, Rodrigo, 2014. "The skewness of scientific productivity," Journal of Informetrics, Elsevier, vol. 8(4), pages 917-934.
    12. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    13. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    14. Cristian Colliander & Per Ahlgren, 2019. "Comparison of publication-level approaches to ex-post citation normalization," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 283-300, July.
    15. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    16. Antonio Perianes-Rodriguez & Javier Ruiz-Castillo, 2016. "A comparison of two ways of evaluating research units working in different scientific fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 539-561, February.
    17. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    18. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    19. Dejian Yu & Sun Meng, 2018. "An overview of biomass energy research with bibliometric indicators," Energy & Environment, , vol. 29(4), pages 576-590, June.
    20. Bornmann, Lutz & Haunschild, Robin, 2022. "Empirical analysis of recent temporal dynamics of research fields: Annual publications in chemistry and related areas as an example," Journal of Informetrics, Elsevier, vol. 16(2).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:12:y:2018:i:4:p:1133-1145. See general information about how to correct material in RePEc.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.