
How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators

Authors

Listed:
  • Lutz Bornmann

    (Administrative Headquarters of the Max Planck Society)

  • Alexander Tekles

    (Administrative Headquarters of the Max Planck Society
    Ludwig-Maximilians-University Munich)

  • Loet Leydesdorff

    (University of Amsterdam)

Abstract

The integrated impact indicator (I3) was recently introduced as an indicator in which citations are weighted according to the percentile rank class of each publication in a set of publications. I3 can also be used as a field-normalized indicator. Field normalization is common practice in bibliometrics, especially when institutions and countries are compared: publication and citation practices differ so strongly among fields that citation impact must be normalized for cross-field comparisons. In this study, we test the ability of I3 to discriminate between quality levels of papers as defined by Faculty members at F1000Prime, a post-publication peer review system for assessing papers in the biomedical area. We thus test the convergent validity of I3 (more precisely of I3/N, the size-independent variant in which I3 is divided by the number of papers) using peer assessments as the baseline, and compare its validity with that of several other (field-normalized) indicators: the mean-normalized citation score (MNCS), the relative citation ratio (RCR), the citation score normalized by cited references (CSNCR), characteristic scores and scales (CSS), the source-normalized citation score, citation percentiles, and the proportion of papers belonging to the x% most frequently cited papers (PPtop x%). The results show that the PPtop 1% indicator discriminates best among the quality levels. I3 performs similarly to (slightly better than) most of the other field-normalized indicators. The results therefore suggest that I3 could be a valuable alternative to other indicators in bibliometrics.
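
To make the indicators concrete, the following Python sketch computes I3, its size-independent variant I3/N, and PPtop x% for a small publication set. It is an illustration only, not the authors' code: it assumes the common six percentile rank classes (bottom 50%, 50-75%, 75-90%, 90-95%, 95-99%, top 1%) weighted 1 to 6, a simple "share of less frequently cited papers" percentile rule, and invented citation counts.

    # Illustrative sketch (not the authors' code): I3 with six percentile rank
    # classes, the size-independent variant I3/N, and PPtop x%. Class bounds,
    # weights, the percentile rule, and the citation counts are assumptions.

    def percentile(c, reference_set):
        # Share (in %) of papers in the field-specific reference set that are
        # cited less often than a paper with c citations.
        below = sum(1 for r in reference_set if r < c)
        return 100.0 * below / len(reference_set)

    # Six percentile rank classes with weights 6..1: top 1%, 95-99%, 90-95%,
    # 75-90%, 50-75%, bottom 50%.
    PR_CLASSES = [(99, 6), (95, 5), (90, 4), (75, 3), (50, 2), (0, 1)]

    def pr_weight(p):
        # Weight of the percentile rank class that contains percentile p.
        for lower_bound, weight in PR_CLASSES:
            if p >= lower_bound:
                return weight
        return 1

    def i3(citations, reference_set):
        # I3: sum of the class weights over all papers in the publication set.
        return sum(pr_weight(percentile(c, reference_set)) for c in citations)

    def i3_per_paper(citations, reference_set):
        # I3/N: the size-independent variant tested in the study.
        return i3(citations, reference_set) / len(citations)

    def pp_top(citations, reference_set, x=1.0):
        # PPtop x%: proportion of papers at or above the (100 - x)th percentile.
        top = sum(1 for c in citations if percentile(c, reference_set) >= 100 - x)
        return top / len(citations)

    if __name__ == "__main__":
        reference_set = [0, 0, 1, 2, 2, 3, 5, 8, 13, 40]  # toy field baseline
        unit = [1, 3, 8, 40]                              # toy publication set
        print(i3(unit, reference_set), i3_per_paper(unit, reference_set))
        print(pp_top(unit, reference_set, x=10))

In practice, the reference set would contain the papers of the same field and publication year, and ties (papers with equal citation counts) need an explicit rule; Waltman and Schreiber (2013, listed in the references below) discuss such choices for percentile-based indicators.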

Suggested Citation

  • Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
  • Handle: RePEc:spr:scient:v:119:y:2019:i:2:d:10.1007_s11192-019-03071-6
    DOI: 10.1007/s11192-019-03071-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-019-03071-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-019-03071-6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Caroline S. Wagner & Loet Leydesdorff, 2012. "An Integrated Impact Indicator: A new definition of 'Impact' with policy relevance," Research Evaluation, Oxford University Press, vol. 21(3), pages 183-188, July.
    2. Lutz Bornmann & Wolfgang Glänzel, 2018. "Which differences can be expected when two universities in the Leiden Ranking are compared? Some benchmarks for institutional research evaluations," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1101-1105, May.
    3. Loet Leydesdorff & Lutz Bornmann & Rüdiger Mutz & Tobias Opthof, 2011. "Turning the tables on citation analysis one more time: Principles for comparing sets of documents," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(7), pages 1370-1381, July.
    4. Loet Leydesdorff & Filippo Radicchi & Lutz Bornmann & Claudio Castellano & Wouter de Nooy, 2013. "Field-normalized impact factors (IFs): A comparison of rescaling and fractionally counted IFs," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(11), pages 2299-2309, November.
    5. Moed, Henk F., 2010. "Measuring contextual citation impact of scientific journals," Journal of Informetrics, Elsevier, vol. 4(3), pages 265-277.
    6. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    7. Lutz Bornmann, 2014. "How are excellent (highly cited) papers defined in bibliometrics? A quantitative analysis of the literature," Research Evaluation, Oxford University Press, vol. 23(2), pages 166-173.
    8. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    9. Loet Leydesdorff, 2006. "Can scientific journals be classified in terms of aggregated journal‐journal citation relations using the Journal Citation Reports?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 57(5), pages 601-613, March.
    10. Lundberg, Jonas, 2007. "Lifting the crown—citation z-score," Journal of Informetrics, Elsevier, vol. 1(2), pages 145-154.
    11. Loet Leydesdorff & Lutz Bornmann, 2011. "How fractional counting of citations affects the impact factor: Normalization in terms of differences in citation potentials among fields of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(2), pages 217-229, February.
    12. Ronald Rousseau, 2012. "Basic properties of both percentile rank scores and the I3 indicator," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(2), pages 416-420, February.
    13. Loet Leydesdorff & Lutz Bornmann, 2016. "The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(3), pages 707-714, March.
    14. Loet Leydesdorff & Lutz Bornmann, 2011. "Integrated impact indicators compared with impact factors: An alternative research design with policy implications," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(11), pages 2133-2146, November.
    15. Opthof, Tobias & Leydesdorff, Loet, 2010. "Caveats for the journal and field normalizations in the CWTS (“Leiden”) evaluations of research performance," Journal of Informetrics, Elsevier, vol. 4(3), pages 423-430.
    16. Ronald Rousseau, 2012. "Basic properties of both percentile rank scores and the I3 indicator," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(2), pages 416-420, February.
    17. Ludo Waltman & Michael Schreiber, 2013. "On the calculation of percentile-based bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 372-379, February.
    18. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    19. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    20. Bornmann, Lutz & Leydesdorff, Loet, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.
    21. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    22. Bornmann, Lutz & Marx, Werner, 2015. "Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts?," Journal of Informetrics, Elsevier, vol. 9(2), pages 408-418.
    23. Bornmann, Lutz & Leydesdorff, Loet & Mutz, Rüdiger, 2013. "The use of percentiles and percentile rank classes in the analysis of bibliometric data: Opportunities and limits," Journal of Informetrics, Elsevier, vol. 7(1), pages 158-165.
    24. Werner Marx & Lutz Bornmann, 2015. "On the causes of subject-specific citation rates in Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(2), pages 1823-1827, February.
    25. Richard Klavans & Kevin W. Boyack, 2017. "Which Type of Citation Analysis Generates the Most Accurate Taxonomy of Scientific and Technical Knowledge?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(4), pages 984-998, April.
    26. Per O. Seglen, 1992. "The skewness of science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 43(9), pages 628-638, October.
    27. Ludo Waltman & Michael Schreiber, 2013. "On the calculation of percentile‐based bibliometric indicators," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(2), pages 372-379, February.
    28. Wolfgang Glänzel & Bart Thijs & Koenraad Debackere, 2014. "The application of citation-based performance classes to the disciplinary and multidisciplinary assessment in national comparison and institutional research assessment," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 939-952, November.
    29. Michel Zitt & Henry Small, 2008. "Modifying the journal impact factor by fractional citation weighting: The audience factor," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(11), pages 1856-1860, September.
    30. Loet Leydesdorff & Lutz Bornmann, 2012. "Percentile ranks and the integrated impact indicator (I3)," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(9), pages 1901-1902, September.
    31. B Ian Hutchins & Xin Yuan & James M Anderson & George M Santangelo, 2016. "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-25, September.
    32. Lutz Bornmann & Werner Marx & Andreas Barth, 2013. "The Normalization of Citation Counts Based on Classification Systems," Publications, MDPI, vol. 1(2), pages 1-9, August.
    33. Ludo Waltman & Nees Jan van Eck, 2013. "Source normalized indicators of citation impact: an overview of different approaches and an empirical comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 699-716, September.
    34. Smolinsky, Lawrence, 2016. "Expected number of citations and the crown indicator," Journal of Informetrics, Elsevier, vol. 10(1), pages 43-47.
    35. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Pech, Gerson & Delgado, Catarina, 2021. "Screening the most highly cited papers in longitudinal bibliometric studies and systematic literature reviews of a research field or journal: Widespread used metrics vs a percentile citation-based approach," Journal of Informetrics, Elsevier, vol. 15(3).
    2. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    3. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    3. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    4. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    5. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    6. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    7. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    8. Albarrán, Pedro & Herrero, Carmen & Ruiz-Castillo, Javier & Villar, Antonio, 2017. "The Herrero-Villar approach to citation impact," Journal of Informetrics, Elsevier, vol. 11(2), pages 625-640.
    9. Bornmann, Lutz & Leydesdorff, Loet & Wang, Jian, 2013. "Which percentile-based approach should be preferred for calculating normalized citation impact values? An empirical comparison of five approaches including a newly developed citation-rank approach (P100)," Journal of Informetrics, Elsevier, vol. 7(4), pages 933-944.
    10. Loet Leydesdorff, 2013. "An evaluation of impacts in “Nanoscience & nanotechnology”: steps towards standards for citation analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(1), pages 35-55, January.
    11. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    12. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.
    13. Loet Leydesdorff, 2012. "Alternatives to the journal impact factor: I3 and the top-10% (or top-25%?) of the most-highly cited papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 355-365, August.
    14. Bornmann, Lutz & Marx, Werner, 2015. "Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts?," Journal of Informetrics, Elsevier, vol. 9(2), pages 408-418.
    15. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    16. Liwei Cai & Jiahao Tian & Jiaying Liu & Xiaomei Bai & Ivan Lee & Xiangjie Kong & Feng Xia, 2019. "Scholarly impact assessment: a survey of citation weighting solutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 453-478, February.
    17. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    18. Lutz Bornmann & Werner Marx & Andreas Barth, 2013. "The Normalization of Citation Counts Based on Classification Systems," Publications, MDPI, vol. 1(2), pages 1-9, August.
    19. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    20. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).

    More about this item

    Keywords

    Bibliometrics; I3; Field normalization; Citation analysis; Convergent validity;



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:119:y:2019:i:2:d:10.1007_s11192-019-03071-6. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.