
The lack of meaningful boundary differences between journal impact factor quartiles undermines their independent use in research evaluation

Authors

Listed:
  • Gabriel-Alexandru Vîiu (National University of Political Studies and Public Administration)
  • Mihai Păunescu (National University of Political Studies and Public Administration)

Abstract

Journal impact factor (JIF) quartiles are often used as a convenient means of conducting research evaluation, abstracting away from the underlying JIF values. We highlight and investigate an intrinsic problem with this approach: the differences between JIF values at quartile boundaries are usually very small, and often so small that journals in different quartiles cannot be considered meaningfully different with respect to impact. By systematically investigating JIF values in recent editions of the Journal Citation Reports (JCR) we find that between 10% and 30% of the journals in a JCR category are typically poorly differentiated, with the social sciences more affected than the science categories. However, this global result conceals important variation, and we therefore also provide a detailed account of poor quartile boundary differentiation by constructing in-depth local quartile similarity profiles for each JCR category. Further systematic analyses show that poor quartile boundary differentiation tends to follow poor overall differentiation, which naturally varies by field. In addition, in most categories the journals that experience a quartile shift are the same journals that are poorly differentiated. Our work provides sui generis documentation of the continuing phenomenon of impact factor inflation, and it also explains and reinforces some recent findings on the ranking stability of journals and on the JIF-based comparison of papers. Conceptually, there is a fundamental problem in the fact that JIF quartile classes artificially magnify underlying differences that can be insignificant; we argue that the singular use of JIF quartiles is in fact a second-order ecological fallacy. We recommend abandoning the reification of quartiles as an independent method for the research assessment of individual scholars.
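
The measurement issue described in the abstract can be made concrete with a small illustration: rank the journals of a JCR category by JIF, cut the ranking into quartiles, and check how large the JIF gap actually is where two quartiles meet. The Python sketch below does this under simplifying assumptions; the rank-based quartile rule, the fixed absolute-difference threshold, and the toy JIF values are illustrative choices and not the authors' actual criterion or data.

    from typing import Dict, List, Tuple

    def assign_quartiles(jifs: Dict[str, float]) -> List[Tuple[str, float, int]]:
        """Rank journals by descending JIF and assign quartiles 1-4 by rank position."""
        ranked = sorted(jifs.items(), key=lambda kv: kv[1], reverse=True)
        n = len(ranked)
        return [(name, jif, min(4, int(4 * i / n) + 1))  # 1 = top quartile
                for i, (name, jif) in enumerate(ranked)]

    def poorly_differentiated_pairs(ranked, threshold=0.05):
        """Flag adjacent journals that straddle a quartile boundary but whose
        JIF gap is below `threshold` (a hypothetical cut-off for illustration)."""
        flags = []
        for (n1, j1, q1), (n2, j2, q2) in zip(ranked, ranked[1:]):
            if q1 != q2 and abs(j1 - j2) < threshold:
                flags.append((n1, q1, n2, q2, round(j1 - j2, 3)))
        return flags

    # Toy category: the Q3/Q4 boundary gap is only 0.002, so journals F and G
    # land in different quartiles despite nearly identical impact factors.
    category = {"A": 3.102, "B": 3.099, "C": 2.500, "D": 2.100,
                "E": 1.801, "F": 1.300, "G": 1.298, "H": 0.900}
    print(poorly_differentiated_pairs(assign_quartiles(category)))
    # [('F', 3, 'G', 4, 0.002)]

In a full analysis the share of such boundary-similar journals would be computed per JCR category across editions; the threshold choice and tie handling are exactly the kind of details the paper examines, so the constants above should not be read as its method.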

Suggested Citation

  • Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The lack of meaningful boundary differences between journal impact factor quartiles undermines their independent use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1495-1525, February.
  • Handle: RePEc:spr:scient:v:126:y:2021:i:2:d:10.1007_s11192-020-03801-1
    DOI: 10.1007/s11192-020-03801-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-020-03801-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-020-03801-1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2018. "Double rank analysis for research assessment," Journal of Informetrics, Elsevier, vol. 12(1), pages 31-41.
    2. Juan Miguel Campanario, 2014. "The effect of citations on the significance of decimal places in the computation of journal impact factors," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(2), pages 289-298, May.
    3. Wang, Qi & Waltman, Ludo, 2016. "Large-scale analysis of the accuracy of the journal classification systems of Web of Science and Scopus," Journal of Informetrics, Elsevier, vol. 10(2), pages 347-364.
    4. Ruth Müller & Sarah de Rijcke, 2017. "Exploring the epistemic impacts of academic performance indicators in the life sciences," Research Evaluation, Oxford University Press, vol. 26(3), pages 157-168.
    5. Pedro Albarrán & Juan A. Crespo & Ignacio Ortuño & Javier Ruiz-Castillo, 2011. "The skewness of science in 219 sub-fields and a number of aggregates," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(2), pages 385-397, August.
    6. Benjamin M. Althouse & Jevin D. West & Carl T. Bergstrom & Theodore Bergstrom, 2009. "Differences in impact factor across fields and over time," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(1), pages 27-34, January.
    7. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    8. George A. Lozano & Vincent Larivière & Yves Gingras, 2012. "The weakening relationship between the impact factor and papers' citations in the digital age," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(11), pages 2140-2145, November.
    9. Fei Shu & Wei Quan & Bikun Chen & Junping Qiu & Cassidy R. Sugimoto & Vincent Larivière, 2020. "The role of Web of Science publications in China’s tenure system," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(3), pages 1683-1695, March.
    10. David I. Stern, 2013. "Uncertainty Measures for Economics Journal Impact Factors," Journal of Economic Literature, American Economic Association, vol. 51(1), pages 173-189, March.
    11. Ruiz-Castillo, Javier & Costas, Rodrigo, 2018. "Individual and field citation distributions in 29 broad scientific fields," Journal of Informetrics, Elsevier, vol. 12(3), pages 868-892.
    12. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    13. Éric Archambault & Vincent Larivière, 2009. "History of the journal impact factor: Contingencies and consequences," Scientometrics, Springer;Akadémiai Kiadó, vol. 79(3), pages 635-649, June.
    14. Pajić, Dejan, 2015. "On the stability of citation-based journal rankings," Journal of Informetrics, Elsevier, vol. 9(4), pages 990-1006.
    15. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    16. Ruth Müller & Sarah de Rijcke, 2017. "Thinking with Indicators. Exploring the Epistemic Impacts of Academic Performance Indicators in the Life Sciences," Research Evaluation, Oxford University Press, vol. 26(4), pages 361-361.
    17. Loet Leydesdorff & Lutz Bornmann, 2016. "The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(3), pages 707-714, March.
    18. Ruben Miranda & Esther Garcia-Carpintero, 2019. "Comparison of the share of documents and citations from different quartile journals in 25 research areas," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 479-501, October.
    19. A. I. Pudovkin & Eugene Garfield, 2012. "Rank normalization of impact factors will resolve Vanclay’s dilemma with TRIF," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 409-412, August.
    20. George A. Lozano & Vincent Larivière & Yves Gingras, 2012. "The weakening relationship between the impact factor and papers' citations in the digital age," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(11), pages 2140-2145, November.
    21. J. A. García & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia & J. Martinez-Baena, 2012. "On first quartile journals which are not of highest impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 925-943, March.
    22. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2019. "Evaluating research and researchers by the journal impact factor: Is it better than coin flipping?," Journal of Informetrics, Elsevier, vol. 13(1), pages 314-324.
    23. Jerome K. Vanclay, 2009. "Bias in the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 78(1), pages 3-12, January.
    24. Weishu Liu & Guangyuan Hu & Mengdi Gu, 2016. "The probability of publishing in first-quartile journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(3), pages 1273-1276, March.
    25. Dag W. Aksnes, 2003. "A macro study of self-citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 56(2), pages 235-246, February.
    26. Thelwall, Mike, 2016. "Are the discretised lognormal and hooked power law distributions plausible for citation data?," Journal of Informetrics, Elsevier, vol. 10(2), pages 454-470.
    27. Jerome K. Vanclay, 2012. "Impact factor: outdated artefact or stepping-stone to journal certification?," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 211-238, August.
    28. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    29. Lutz Bornmann, 2017. "Confidence intervals for Journal Impact Factors," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(3), pages 1869-1871, June.
    30. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one (a minimal sketch of this overlap idea follows the list below).
    1. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The citation impact of articles from which authors gained monetary rewards based on journal metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4941-4974, June.
    2. Ruben Miranda & Esther Garcia-Carpintero, 2019. "Comparison of the share of documents and citations from different quartile journals in 25 research areas," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 479-501, October.
    3. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    4. Zsolt Kohus & Márton Demeter & László Kun & Eszter Lukács & Katalin Czakó & Gyula Péter Szigeti, 2022. "A Study of the Relation between Byline Positions of Affiliated/Non-Affiliated Authors and the Scientific Impact of European Universities in Times Higher Education World University Rankings," Sustainability, MDPI, vol. 14(20), pages 1-14, October.
    5. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    6. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    7. Alonso Rodríguez-Navarro & Ricardo Brito, 2019. "Probability and expected frequency of breakthroughs: basis and use of a robust method of research assessment," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 213-235, April.
    8. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    9. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Costa, 2023. "Correlating article citedness and journal impact: an empirical investigation by field on a large-scale dataset," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1877-1894, March.
    10. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2019. "Evaluating research and researchers by the journal impact factor: Is it better than coin flipping?," Journal of Informetrics, Elsevier, vol. 13(1), pages 314-324.
    11. Gerson Pech & Catarina Delgado, 2020. "Percentile and stochastic-based approach to the comparison of the number of citations of articles indexed in different bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 223-252, April.
    12. Pech, Gerson & Delgado, Catarina, 2021. "Screening the most highly cited papers in longitudinal bibliometric studies and systematic literature reviews of a research field or journal: Widespread used metrics vs a percentile citation-based app," Journal of Informetrics, Elsevier, vol. 15(3).
    13. David I Stern, 2014. "High-Ranked Social Science Journal Articles Can Be Identified from Early Citation Information," PLOS ONE, Public Library of Science, vol. 9(11), pages 1-11, November.
    14. Hamdi A. Al-Jamimi & Galal M. BinMakhashen & Lutz Bornmann & Yousif Ahmed Al Wajih, 2023. "Saudi Arabia research: academic insights and trend analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(10), pages 5595-5627, October.
    15. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2024. "Rank analysis of most cited publications, a new approach for research assessments," Journal of Informetrics, Elsevier, vol. 18(2).
    16. Yves Fassin, 2021. "Does the Financial Times FT50 journal list select the best management and economics journals?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5911-5943, July.
    17. Michael McAleer & Judit Olah & Jozsef Popp, 2018. "Pros and Cons of the Impact Factor in a Rapidly Changing Digital World," Tinbergen Institute Discussion Papers 18-014/III, Tinbergen Institute.
    18. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    19. de Carvalho, Gustavo Dambiski Gomes & Sokulski, Carla Cristiane & da Silva, Wesley Vieira & de Carvalho, Hélio Gomes & de Moura, Rafael Vignoli & de Francisco, Antonio Carlos & da Veiga, Claudimar Per, 2020. "Bibliometrics and systematic reviews: A comparison between the Proknow-C and the Methodi Ordinatio," Journal of Informetrics, Elsevier, vol. 14(3).
    20. Bornmann, Lutz & Williams, Richard, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
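
    As noted above the list, related items are those that share cited works and citing works with this article. A minimal Python illustration of that overlap idea, using invented identifiers and a naive count-based score rather than IDEAS's actual ranking method, might look like this:

        def relatedness(focal_refs, focal_citers, candidates):
            """Score each candidate item by shared references plus shared citers."""
            scores = {}
            for item, (refs, citers) in candidates.items():
                shared_refs = len(focal_refs & refs)        # bibliographic coupling
                shared_citers = len(focal_citers & citers)  # co-citation overlap
                scores[item] = shared_refs + shared_citers
            return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

        # Made-up identifiers purely for illustration.
        focal_refs, focal_citers = {"r1", "r2", "r3"}, {"c1", "c2"}
        candidates = {"paper_A": ({"r1", "r2", "r9"}, {"c1"}),
                      "paper_B": ({"r7"}, {"c5"})}
        print(relatedness(focal_refs, focal_citers, candidates))
        # [('paper_A', 3), ('paper_B', 0)]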

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:126:y:2021:i:2:d:10.1007_s11192-020-03801-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.