Printed from https://ideas.repec.org/a/spr/scient/v109y2016i3d10.1007_s11192-016-2150-8.html

Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report

Author

Listed:
  • Loet Leydesdorff

    (University of Amsterdam)

  • Paul Wouters

    (Leiden University)

  • Lutz Bornmann

    (Administrative Headquarters of the Max Planck Society)

Abstract

Bibliometric indicators such as journal impact factors, h-indices, and total citation counts are algorithmic artifacts that can be used in research evaluation and management. These artifacts have no meaning by themselves, but receive their meaning from attributions in institutional practices. We distinguish four main stakeholders in these practices: (1) producers of bibliometric data and indicators; (2) bibliometricians who develop and test indicators; (3) research managers who apply the indicators; and (4) the scientists being evaluated with potentially competing career interests. These different positions may lead to different and sometimes conflicting perspectives on the meaning and value of the indicators. The indicators can thus be considered as boundary objects which are socially constructed in translations among these perspectives. This paper proposes an analytical clarification by listing an informed set of (sometimes unsolved) problems in bibliometrics which can also shed light on the tension between simple but invalid indicators that are widely used (e.g., the h-index) and more sophisticated indicators that are not used or cannot be used in evaluation practices because they are not transparent for users, cannot be calculated, or are difficult to interpret.
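The abstract contrasts simple but widely used indicators such as the h-index with more sophisticated alternatives. As a minimal illustration (not part of the article itself), the standard h-index of a publication list — the largest h such that h papers each have at least h citations — can be computed as follows:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Example: citations per paper [10, 8, 5, 4, 3] -> h-index of 4,
# since four papers have at least 4 citations each.
print(h_index([10, 8, 5, 4, 3]))
```

The simplicity of this calculation is part of the tension the paper analyzes: the indicator is transparent and easy to compute, but, as the cited literature (e.g., Waltman & van Eck, 2012) argues, not necessarily valid as a measure of impact.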

Suggested Citation

  • Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
  • Handle: RePEc:spr:scient:v:109:y:2016:i:3:d:10.1007_s11192-016-2150-8
    DOI: 10.1007/s11192-016-2150-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-016-2150-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-016-2150-8?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Loet Leydesdorff & Lutz Bornmann & Rüdiger Mutz & Tobias Opthof, 2011. "Turning the tables on citation analysis one more time: Principles for comparing sets of documents," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(7), pages 1370-1381, July.
    2. Zvi Griliches, 1998. "Productivity, R&D, and the Data Constraint," NBER Chapters, in: R&D and Productivity: The Econometric Evidence, pages 347-374, National Bureau of Economic Research, Inc.
    3. Vincent Larivière & Éric Archambault & Yves Gingras & Étienne Vignola‐Gagné, 2006. "The place of serials in referencing practices: Comparing natural sciences and engineering with social sciences and humanities," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 57(8), pages 997-1004, June.
    4. Andrew J. Oswald, 2007. "An Examination of the Reliability of Prestigious Scholarly Journals: Evidence and Implications for Decision‐Makers," Economica, London School of Economics and Political Science, vol. 74(293), pages 21-31, February.
    5. Bornmann, Lutz & Leydesdorff, Loet & Mutz, Rüdiger, 2013. "The use of percentiles and percentile rank classes in the analysis of bibliometric data: Opportunities and limits," Journal of Informetrics, Elsevier, vol. 7(1), pages 158-165.
    6. Ludo Waltman & Nees Jan Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    7. Anton J. Nederhof, 2006. "Bibliometric monitoring of research performance in the Social Sciences and the Humanities: A Review," Scientometrics, Springer;Akadémiai Kiadó, vol. 66(1), pages 81-100, January.
    8. G. Kreft & E. Leeuw, 1988. "The See-Saw Effect: a multilevel problem?," Quality & Quantity: International Journal of Methodology, Springer, vol. 22(2), pages 127-137, June.
    9. Alexander I. Pudovkin & Eugene Garfield, 2002. "Algorithmic procedure for finding semantically related journals," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 53(13), pages 1113-1119, November.
    10. Christoph Neuhaus & Hans-Dieter Daniel, 2009. "A new reference standard for citation analysis in chemistry and related fields based on the sections of Chemical Abstracts," Scientometrics, Springer;Akadémiai Kiadó, vol. 78(2), pages 219-229, February.
    11. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    12. Daniele Rotolo & Loet Leydesdorff, 2015. "Matching Medline/PubMed data with Web of Science: A routine in R language," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 2155-2159, October.
    13. Anne-Wil Harzing, 2014. "A longitudinal study of Google Scholar coverage between 2012 and 2013," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 565-575, January.
    14. Loet Leydesdorff & Tobias Opthof, 2013. "Citation analysis with medical subject Headings (MeSH) using the Web of Knowledge: A new routine," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(5), pages 1076-1080, May.
    15. Loet Leydesdorff & Lutz Bornmann, 2011. "Integrated impact indicators compared with impact factors: An alternative research design with policy implications," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(11), pages 2133-2146, November.
    16. Robert J. W. Tijssen & Martijn S. Visser & Thed N. van Leeuwen, 2002. "Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 54(3), pages 381-397, July.
    17. Ismael Rafols & Loet Leydesdorff, 2009. "Content‐based and algorithmic classifications of journals: Perspectives on the dynamics of scientific communication and indexer effects," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(9), pages 1823-1835, September.
    18. Per O. Seglen, 1992. "The skewness of science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 43(9), pages 628-638, October.
    19. Jian Wang, 2013. "Citation time window choice for research impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 851-872, March.
    20. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan Eck & Thed N. Leeuwen & Anthony F.J. Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    21. Werner Marx, 2011. "Special features of historical papers from the viewpoint of bibliometrics," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(3), pages 433-439, March.
    22. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    23. Bornmann, Lutz & Mutz, Rüdiger & Hug, Sven E. & Daniel, Hans-Dieter, 2011. "A multilevel meta-analysis of studies reporting correlations between the h index and 37 different h index variants," Journal of Informetrics, Elsevier, vol. 5(3), pages 346-359.
    24. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2016. "A farewell to the MNCS and like size-independent indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 646-651.
    25. Leo Egghe, 2006. "Theory and practise of the g-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 131-152, October.
    26. Herbert A. Simon, 2002. "Near decomposability and the speed of evolution," Industrial and Corporate Change, Oxford University Press and the Associazione ICC, vol. 11(3), pages 587-599, June.
    27. Loet Leydesdorff & Tobias Opthof, 2011. "Scopus' SNIP indicator: Reply to Moed," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(1), pages 214-215, January.
    28. Michel Zitt & Suzy Ramanana-Rahary & Elise Bassecoulard, 2005. "Relativity of citation performance and excellence measures: From cross-field to cross-scale effects of field-normalisation," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 373-401, April.
    29. Ad A.M. Prins & Rodrigo Costas & Thed N. van Leeuwen & Paul F. Wouters, 2016. "Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data," Research Evaluation, Oxford University Press, vol. 25(3), pages 264-270.
    30. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S., 2013. "Some modifications to the SNIP journal impact indicator," Journal of Informetrics, Elsevier, vol. 7(2), pages 272-285.
    31. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    32. John P A Ioannidis & Kevin Boyack & Paul F Wouters, 2016. "Citation Metrics: A Primer on How (Not) to Normalize," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-7, September.
    33. Moed, Henk F., 2010. "Measuring contextual citation impact of scientific journals," Journal of Informetrics, Elsevier, vol. 4(3), pages 265-277.
    34. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    35. Ludo Waltman & Nees Jan van Eck, 2012. "The inconsistency of the h-index," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(2), pages 406-415, February.
    36. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    37. Loet Leydesdorff & Lutz Bornmann, 2016. "The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(3), pages 707-714, March.
    38. Lutz Bornmann & Andreas Thor & Werner Marx & Hermann Schier, 2016. "The application of bibliometrics to research evaluation in the humanities and social sciences: An exploratory study using normalized Google Scholar data for the publications of a research institute," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(11), pages 2778-2789, November.
    39. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    40. Susanne E. Baumgartner & Loet Leydesdorff, 2014. "Group-based trajectory modeling (GBTM) of citations in scholarly literature: Dynamic qualities of “transient” and “sticky knowledge claims”," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(4), pages 797-811, April.
    41. Rafols, Ismael & Leydesdorff, Loet & O’Hare, Alice & Nightingale, Paul & Stirling, Andy, 2012. "How journal rankings can suppress interdisciplinary research: A comparison between Innovation Studies and Business & Management," Research Policy, Elsevier, vol. 41(7), pages 1262-1282.
    42. Joost Kosten, 2016. "A classification of the use of research indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(1), pages 457-464, July.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    3. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    4. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    5. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    6. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    7. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    8. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    9. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    10. Lutz Bornmann & Loet Leydesdorff, 2018. "Count highly-cited papers instead of papers with h citations: use normalized citation counts and compare “like with like”!," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1119-1123, May.
    11. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.
    12. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    13. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    14. Albarrán, Pedro & Herrero, Carmen & Ruiz-Castillo, Javier & Villar, Antonio, 2017. "The Herrero-Villar approach to citation impact," Journal of Informetrics, Elsevier, vol. 11(2), pages 625-640.
    15. Hu, Zhigang & Tian, Wencan & Xu, Shenmeng & Zhang, Chunbo & Wang, Xianwen, 2018. "Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers," Journal of Informetrics, Elsevier, vol. 12(4), pages 1133-1145.
    16. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    17. Liwei Cai & Jiahao Tian & Jiaying Liu & Xiaomei Bai & Ivan Lee & Xiangjie Kong & Feng Xia, 2019. "Scholarly impact assessment: a survey of citation weighting solutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 453-478, February.
    18. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.
    19. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    20. Zhesi Shen & Liying Yang & Zengru Di & Jinshan Wu, 2019. "Large enough sample size to rank two groups of data reliably according to their means," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 653-671, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:109:y:2016:i:3:d:10.1007_s11192-016-2150-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.