Professional standards in bibliometric research evaluation? A meta-evaluation of European assessment practice 2005–2019

Author

  • Arlette Jappe

Abstract

Despite growing demand for practicable methods of research evaluation, the use of bibliometric indicators remains controversial. This paper examines performance assessment practice in Europe: first, it identifies the most commonly used bibliometric methods, and second, it identifies the actors who have defined widespread practices. The framework of this investigation is Abbott's theory of professions, and I argue that indicator-based research assessment constitutes a potential jurisdiction for both individual experts and expert organizations. The investigation was conducted using a search methodology that yielded 138 evaluation studies from 21 EU countries, covering the period 2005 to 2019. Structured content analysis revealed the following findings: (1) Bibliometric research assessment is performed most frequently in the Nordic countries, the Netherlands, Italy, and the United Kingdom. (2) The Web of Science (WoS) is the dominant database used for public research assessment in Europe. (3) Expert organizations invest in the improvement of WoS citation data and set technical standards with regard to data quality. (4) Citation impact is most frequently assessed with reference to international scientific fields. (5) The WoS classification of science fields has retained its function as a de facto reference standard for research performance assessment. A detailed comparison of assessment practices between five dedicated organizations and other individual bibliometric experts suggests that corporate ownership of, and limited access to, the most widely used citation databases have had a restraining effect on the development and diffusion of professional bibliometric methods during this period.

Suggested Citation

  • Arlette Jappe, 2020. "Professional standards in bibliometric research evaluation? A meta-evaluation of European assessment practice 2005–2019," PLOS ONE, Public Library of Science, vol. 15(4), pages 1-23, April.
  • Handle: RePEc:plo:pone00:0231735
    DOI: 10.1371/journal.pone.0231735

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0231735
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0231735&type=printable
    Download Restriction: no

    File URL: https://libkey.io/10.1371/journal.pone.0231735?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription.

    References listed on IDEAS

    1. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    2. Barbara Good, 2012. "Assessing the effects of a collaborative research funding scheme: An approach combining meta-evaluation and evaluation synthesis," Research Evaluation, Oxford University Press, vol. 21(5), pages 381-391, October.
    3. Rémi Barré, 2019. "Les indicateurs sont morts, vive les indicateurs! Towards a political economy of S&T indicators: A critical overview of the past 35 years," Research Evaluation, Oxford University Press, vol. 28(1), pages 2-6.
    4. Eleni Fragkiadaki & Georgios Evangelidis, 2014. "Review of the indirect citations paradigm: theory and practice of the assessment of papers, authors and journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 99(2), pages 261-288, May.
    5. Ludo Waltman & Nees Jan van Eck, 2013. "Source normalized indicators of citation impact: an overview of different approaches and an empirical comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 699-716, September.
    6. Fiorenzo Franceschini & Domenico Maisano & Luca Mastrogiacomo, 2015. "Research quality evaluation: comparing citation counts considering bibliometric database errors," Quality & Quantity: International Journal of Methodology, Springer, vol. 49(1), pages 155-165, January.
    7. Henk F. Moed & Gali Halevi, 2015. "Multidimensional assessment of scholarly research impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 1988-2002, October.
    8. Linda Sīle & Janne Pölönen & Gunnar Sivertsen & Raf Guns & Tim C E Engels & Pavel Arefiev & Marta Dušková & Lotte Faurbæk & András Holl & Emanuel Kulczycki & Bojan Macan & Gustaf Nelhans & Michal Petr, 2018. "Comprehensiveness of national bibliographic databases for social sciences and humanities: Findings from a European survey," Research Evaluation, Oxford University Press, vol. 27(4), pages 310-322.
    9. Sarah de Rijcke & Paul F. Wouters & Alex D. Rushforth & Thomas P. Franssen & Björn Hammarfelt, 2016. "Evaluation practices and effects of indicator use—a literature review," Research Evaluation, Oxford University Press, vol. 25(2), pages 161-169.
    10. Yves Gingras & Mahdi Khelfaoui, 2019. "Do we need a book citation index for research evaluation?," Research Evaluation, Oxford University Press, vol. 28(4), pages 383-393.
    11. Wolfgang Glänzel & András Schubert, 2003. "A new classification scheme of science fields and subfields designed for scientometric evaluation purposes," Scientometrics, Springer;Akadémiai Kiadó, vol. 56(3), pages 357-367, March.
    12. Katrin Milzow & Anke Reinhardt & Sten Söderberg & Klaus Zinöcker, 2019. "Understanding the use and usability of research evaluation studies," Research Evaluation, Oxford University Press, vol. 28(1), pages 94-107.
    13. Loet Leydesdorff & Lutz Bornmann, 2016. "The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(3), pages 707-714, March.
    14. Steven Lam & Warren Dodd & Jane Whynot & Kelly Skinner, 2019. "How is gender being addressed in the international development evaluation literature? A meta-evaluation," Research Evaluation, Oxford University Press, vol. 28(2), pages 158-168.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Lin Zhang & Beibei Sun & Fei Shu & Ying Huang, 2022. "Comparing paper level classifications across different methods and systems: an investigation of Nature publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7633-7651, December.
    3. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    4. Ruiz-Castillo, Javier & Costas, Rodrigo, 2018. "Individual and field citation distributions in 29 broad scientific fields," Journal of Informetrics, Elsevier, vol. 12(3), pages 868-892.
    5. Li, Yunrong & Ruiz-Castillo, Javier, 2013. "The comparison of normalization procedures based on different classification systems," Journal of Informetrics, Elsevier, vol. 7(4), pages 945-958.
    6. Ricardo Arencibia-Jorge & Rosa Lidia Vega-Almeida & José Luis Jiménez-Andrade & Humberto Carrillo-Calvet, 2022. "Evolutionary stages and multidisciplinary nature of artificial intelligence research," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5139-5158, September.
    7. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    8. Leydesdorff, Loet & Bornmann, Lutz & Zhou, Ping, 2016. "Construction of a pragmatic base line for journal classifications and maps based on aggregated journal-journal citation relations," Journal of Informetrics, Elsevier, vol. 10(4), pages 902-918.
    9. Sjögårde, Peter & Ahlgren, Per, 2018. "Granularity of algorithmically constructed publication-level classifications of research publications: Identification of topics," Journal of Informetrics, Elsevier, vol. 12(1), pages 133-152.
    10. Shu, Fei & Julien, Charles-Antoine & Zhang, Lin & Qiu, Junping & Zhang, Jing & Larivière, Vincent, 2019. "Comparing journal and paper level classifications of science," Journal of Informetrics, Elsevier, vol. 13(1), pages 202-225.
    11. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    12. Loet Leydesdorff & Lutz Bornmann & Caroline S. Wagner, 2017. "Generating clustered journal maps: an automated system for hierarchical classification," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1601-1614, March.
    13. Fei Shu & Yue Ma & Junping Qiu & Vincent Larivière, 2020. "Classifications of science and their effects on bibliometric evaluations," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2727-2744, December.
    14. Perianes-Rodriguez, Antonio & Ruiz-Castillo, Javier, 2017. "A comparison of the Web of Science and publication-level classification systems of science," Journal of Informetrics, Elsevier, vol. 11(1), pages 32-45.
    15. Jielan Ding & Per Ahlgren & Liying Yang & Ting Yue, 2018. "Disciplinary structures in Nature, Science and PNAS: journal and country levels," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 1817-1852, September.
    16. Michel Zitt, 2015. "Meso-level retrieval: IR-bibliometrics interplay and hybrid citation-words methods in scientific fields delineation," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2223-2245, March.
    17. Carusi, Chiara & Bianchi, Giuseppe, 2019. "Scientific community detection via bipartite scholar/journal graph co-clustering," Journal of Informetrics, Elsevier, vol. 13(1), pages 354-386.
    18. Roberto Camerani & Daniele Rotolo & Nicola Grassano, 2018. "Do Firms Publish? A Multi-Sectoral Analysis," SPRU Working Paper Series 2018-21, SPRU - Science Policy Research Unit, University of Sussex Business School.
    19. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    20. Staša Milojević, 2020. "Nature, Science, and PNAS: disciplinary profiles and impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(3), pages 1301-1315, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0231735. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.