IDEAS home Printed from https://ideas.repec.org/a/spr/scient/v119y2019i3d10.1007_s11192-019-03099-8.html

The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor

Author

Listed:
  • Loet Leydesdorff

    (University of Amsterdam)

  • Lutz Bornmann

    (Max Planck Society, Administrative Headquarters)

  • Jonathan Adams

    (King’s College London
    Institute for Scientific Information, Clarivate Analytics)

Abstract

We propose the I3* indicator as a non-parametric alternative to the journal impact factor (JIF) and h-index. We apply I3* to more than 10,000 journals. The results can be compared with other journal metrics. I3* is a promising variant within the general scheme of non-parametric I3 indicators introduced previously: I3* provides a single metric which correlates with both impact in terms of citations (c) and output in terms of publications (p). We argue for weighting using four percentile classes: the top-1% and top-10% as excellence indicators; the top-50% and bottom-50% as output indicators. Like the h-index, which also incorporates both c and p, I3*-values are size-dependent; however, division of I3* by the number of publications (I3*/N) provides a size-independent indicator which correlates strongly with the 2- and 5-year journal impact factors (JIF2 and JIF5). Unlike the h-index, I3* correlates significantly with both the total number of citations and publications. The values of I3* and I3*/N can be statistically tested against the expectation or against one another using chi-squared tests or effect sizes. A template (in Excel) is provided online for relevant tests.
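
The percentile-class weighting described in the abstract can be sketched in code. The sketch below is a minimal illustration, not the authors' exact specification: the class weights (1–4 from bottom-50% up to top-1%), the non-overlapping class partition, and the expected class proportions used in the chi-squared comparison are all assumptions made for the example; the paper and its online Excel template define the actual procedure.

```python
# Illustrative I3*-style computation (hedged sketch, not the published
# specification): each paper is placed in one of four percentile classes
# of a reference citation distribution, and class counts are combined
# with assumed weights (1, 2, 3, 4).

def percentile_rank(value, reference):
    """Percentage of the reference distribution at or below `value`."""
    below = sum(1 for r in reference if r <= value)
    return 100.0 * below / len(reference)

def i3_star(citations, reference, weights=(1, 2, 3, 4)):
    """Weighted sum over four percentile classes: bottom-50%, top-50%,
    top-10%, top-1% (treated as non-overlapping here for simplicity)."""
    counts = [0, 0, 0, 0]
    for c in citations:
        p = percentile_rank(c, reference)
        if p > 99:
            counts[3] += 1      # top-1%
        elif p > 90:
            counts[2] += 1      # top-10% (excluding top-1%)
        elif p > 50:
            counts[1] += 1      # top-50% (excluding top-10%)
        else:
            counts[0] += 1      # bottom-50%
    score = sum(w * n for w, n in zip(weights, counts))
    return score, counts

def chi_squared(observed, expected):
    """Pearson chi-squared statistic of observed vs. expected counts."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Toy data: a uniform reference distribution and one journal's citations.
reference = list(range(100))
journal = [2, 10, 55, 80, 99]

score, counts = i3_star(journal, reference)
i3_per_paper = score / len(journal)   # size-independent variant (I3*/N)

# Compare observed class counts with the proportions expected under the
# partition above (50%, 40%, 9%, 1% of n papers).
n = sum(counts)
expected = [0.50 * n, 0.40 * n, 0.09 * n, 0.01 * n]
chi2 = chi_squared(counts, expected)
```

With expected cell counts this small a chi-squared test would not be statistically reliable; the toy numbers only illustrate the mechanics that the paper's Excel template carries out on real journal data.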

Suggested Citation

  • Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
  • Handle: RePEc:spr:scient:v:119:y:2019:i:3:d:10.1007_s11192-019-03099-8
    DOI: 10.1007/s11192-019-03099-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-019-03099-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-019-03099-8?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    2. Loet Leydesdorff & Lutz Bornmann & Rüdiger Mutz & Tobias Opthof, 2011. "Turning the tables on citation analysis one more time: Principles for comparing sets of documents," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(7), pages 1370-1381, July.
    3. Lutz Bornmann, 2014. "How are excellent (highly cited) papers defined in bibliometrics? A quantitative analysis of the literature," Research Evaluation, Oxford University Press, vol. 23(2), pages 166-173.
    4. Lutz Bornmann & Rüdiger Mutz & Hans-Dieter Daniel, 2008. "Are there better indices for evaluation purposes than the h index? A comparison of nine different variants of the h index using data from biomedicine," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(5), pages 830-837, March.
    5. Lutz Bornmann & Richard Williams, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
    6. Fred Y. Ye & Loet Leydesdorff, 2014. "The “academic trace” of the performance matrix: A mathematical synthesis of the h-index and the integrated impact indicator (I3)," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(4), pages 742-750, April.
    7. G. Kreft & E. Leeuw, 1988. "The See-Saw Effect: a multilevel problem?," Quality & Quantity: International Journal of Methodology, Springer, vol. 22(2), pages 127-137, June.
    8. Loet Leydesdorff & Caroline S. Wagner & Lutz Bornmann, 2018. "Discontinuities in citation relations among journals: self-organized criticality as a model of scientific revolutions and change," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 623-644, July.
    9. Ludo Waltman & Nees Jan van Eck, 2012. "The inconsistency of the h-index," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(2), pages 406-415, February.
    10. Loet Leydesdorff & Lutz Bornmann, 2011. "Integrated impact indicators compared with impact factors: An alternative research design with policy implications," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(11), pages 2133-2146, November.
    11. Robert J. W. Tijssen & Martijn S. Visser & Thed N. van Leeuwen, 2002. "Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 54(3), pages 381-397, July.
    12. Loet Leydesdorff & Lutz Bornmann & John Mingers, 2019. "Statistical significance and effect sizes of differences among research universities at the level of nations and worldwide based on the Leiden rankings," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 70(5), pages 509-525, May.
    13. Per O. Seglen, 1992. "The skewness of science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 43(9), pages 628-638, October.
    14. Thierry Marchant, 2009. "An axiomatic characterization of the ranking based on the h-index and some other bibliometric rankings of authors," Scientometrics, Springer;Akadémiai Kiadó, vol. 80(2), pages 325-342, August.
    15. David A. Pendlebury & Jonathan Adams, 2012. "Comments on a critique of the Thomson Reuters journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 395-401, August.
    16. Jesper W. Schneider, 2013. "Caveats for using statistical significance tests in research assessments," Journal of Informetrics, Elsevier, vol. 7(1), pages 50-62.
    17. E. Garfield & I. H. Sher, 1963. "New factors in the evaluation of scientific literature through citation indexing," American Documentation, Wiley Blackwell, vol. 14(3), pages 195-201, July.
    18. Lutz Bornmann & Loet Leydesdorff, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.
    19. Éric Archambault & Vincent Larivière, 2009. "History of the journal impact factor: Contingencies and consequences," Scientometrics, Springer;Akadémiai Kiadó, vol. 79(3), pages 635-649, June.
    20. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    21. Tove Faber Frandsen & Ronald Rousseau, 2005. "Article impact calculated over arbitrary periods," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 56(1), pages 58-62, January.
    22. Manolis Antonoyiannakis, 2018. "Impact Factors and the Central Limit Theorem: Why citation averages are scale dependent," Journal of Informetrics, Elsevier, vol. 12(4), pages 1072-1088.
    23. Ludo Waltman & Michael Schreiber, 2013. "On the calculation of percentile-based bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 372-379, February.
    24. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    25. Leo Egghe, 2008. "Mathematical theory of the h- and g-index in case of fractional counting of authorship," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(10), pages 1608-1616, August.
    26. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The citation impact of articles from which authors gained monetary rewards based on journal metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4941-4974, June.
    2. Caroline S. Wagner & Lin Zhang & Loet Leydesdorff, 2022. "A discussion of measuring the top-1% most-highly cited publications: quality and impact of Chinese papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1825-1839, April.
    3. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.
    4. Pech, Gerson & Delgado, Catarina, 2021. "Screening the most highly cited papers in longitudinal bibliometric studies and systematic literature reviews of a research field or journal: Widespread used metrics vs a percentile citation-based app," Journal of Informetrics, Elsevier, vol. 15(3).
    5. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The lack of meaningful boundary differences between journal impact factor quartiles undermines their independent use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1495-1525, February.
    6. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.
    2. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    3. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    4. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    5. Torger Möller & Marion Schmidt & Stefan Hornbostel, 2016. "Assessing the effects of the German Excellence Initiative with bibliometric methods," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2217-2239, December.
    6. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.
    7. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    8. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    9. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    10. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    11. Pedro Albarrán & Antonio Perianes-Rodríguez & Javier Ruiz-Castillo, 2015. "Differences in citation impact across countries," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(3), pages 512-525, March.
    12. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    13. Lutz Bornmann & Werner Marx & Andreas Barth, 2013. "The Normalization of Citation Counts Based on Classification Systems," Publications, MDPI, vol. 1(2), pages 1-9, August.
    14. Bouyssou, Denis & Marchant, Thierry, 2014. "An axiomatic approach to bibliometric rankings and indices," Journal of Informetrics, Elsevier, vol. 8(3), pages 449-477.
    15. Mike Thelwall, 2019. "The influence of highly cited papers on field normalised indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 519-537, February.
    16. Andersen, Jens Peter, 2017. "An empirical and theoretical critique of the Euclidean index," Journal of Informetrics, Elsevier, vol. 11(2), pages 455-465.
    17. Juan A Crespo & Ignacio Ortuño-Ortín & Javier Ruiz-Castillo, 2012. "The Citation Merit of Scientific Publications," PLOS ONE, Public Library of Science, vol. 7(11), pages 1-9, November.
    18. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    19. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    20. Albarrán, Pedro & Herrero, Carmen & Ruiz-Castillo, Javier & Villar, Antonio, 2017. "The Herrero-Villar approach to citation impact," Journal of Informetrics, Elsevier, vol. 11(2), pages 625-640.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:119:y:2019:i:3:d:10.1007_s11192-019-03099-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.