
Philosophers’ appraisals of bibliometric indicators and their use in evaluation: from recognition to knee-jerk rejection

Author

Listed:
  • Ramón A. Feenstra

    (Universitat Jaume I de Castelló)

  • Emilio Delgado López-Cózar

    (Facultad de Comunicación y Documentación. Universidad de Granada)

Abstract

Researchers' knowledge of and stance towards bibliometric indicators is a field of study that has grown in importance in recent decades. In this paper we address this issue for the little-explored areas of philosophy and ethics, in a context, in this case Spain, where bibliometric indicators are widely used in evaluation processes. The study combines data from a self-administered questionnaire completed by 201 researchers and from 14 in-depth interviews with researchers selected according to their affiliation, professional category, gender and area of knowledge. The survey data suggest that researchers do not consider bibliometric indicators a preferred criterion of quality, while self-perceived awareness of a number of indicators is fairly high. The qualitative data point to a generalised rejection of the specific use of indicators, with four main positions being observed: (1) disqualification of the logic of metrics, (2) scepticism about the possibility of assessing quality with quantitative methods, (3) complaints about the incorporation of methods that are considered to belong to other disciplines, and (4) criticism of the consequences this generates in the discipline of philosophy.

Suggested Citation

  • Ramón A. Feenstra & Emilio Delgado López-Cózar, 2022. "Philosophers’ appraisals of bibliometric indicators and their use in evaluation: from recognition to knee-jerk rejection," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 2085-2103, April.
  • Handle: RePEc:spr:scient:v:127:y:2022:i:4:d:10.1007_s11192-022-04265-1
    DOI: 10.1007/s11192-022-04265-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-022-04265-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-022-04265-1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Kaare Aagaard & Carter Bloch & Jesper W. Schneider, 2015. "Impacts of performance-based research funding systems: The case of the Norwegian Publication Indicator," Research Evaluation, Oxford University Press, vol. 24(2), pages 106-117.
    2. Aksnes, Dag W. & Rip, Arie, 2009. "Researchers' perceptions of citations," Research Policy, Elsevier, vol. 38(6), pages 895-905, July.
    3. Anton J. Nederhof, 2006. "Bibliometric monitoring of research performance in the Social Sciences and the Humanities: A Review," Scientometrics, Springer;Akadémiai Kiadó, vol. 66(1), pages 81-100, January.
    4. Gemma E. Derrick & Vincenzo Pavone, 2013. "Democratising research evaluation: Achieving greater public engagement with bibliometrics-informed peer review," Science and Public Policy, Oxford University Press, vol. 40(5), pages 563-575, April.
    5. Björn Hammarfelt & Gaby Haddow, 2018. "Conflicting measures and values: How humanities scholars in Australia and Sweden use and react to bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(7), pages 924-935, July.
    6. Linda Butler, 2003. "Modifying publication practices in response to funding formulas," Research Evaluation, Oxford University Press, vol. 12(1), pages 39-46, April.
    7. Linda Butler, 2007. "Assessing university research: A plea for a balanced approach," Science and Public Policy, Oxford University Press, vol. 34(8), pages 565-574, October.
    8. Carmen Osuna & Laura Cruz-Castro & Luis Sanz-Menéndez, 2011. "Overturning some assumptions about the effects of evaluation systems on publication performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 86(3), pages 575-592, March.
    9. Alberto Martín-Martín & Mike Thelwall & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2021. "Correction to: Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 907-908, January.
    10. Éric Archambault & Étienne Vignola-Gagné & Grégoire Côté & Vincent Larivière & Yves Gingras, 2006. "Benchmarking scientific output in the social sciences and humanities: The limits of existing databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 68(3), pages 329-342, September.
    11. Manjula Wijewickrema, 2021. "Authors’ perception on abstracting and indexing databases in different subject domains," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3063-3089, April.
    12. Andrei V. Grinëv & Daria S. Bylieva & Victoria V. Lobatyuk, 2021. "Russian University Teachers’ Perceptions of Scientometrics," Publications, MDPI, vol. 9(2), pages 1-16, May.
    13. Carolina Cañibano & Immaculada Vilardell & Carmen Corona & Carlos Benito-Amat, 2018. "The evaluation of research excellence and the dynamics of knowledge production in the humanities: The case of history in Spain," Science and Public Policy, Oxford University Press, vol. 45(6), pages 775-789.
    14. Htet Htet Aung & Han Zheng & Mojisola Erdt & Ashley Sara Aw & Sei‐Ching Joanna Sin & Yin‐Leng Theng, 2019. "Investigating familiarity and usage of traditional metrics and altmetrics," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 70(8), pages 872-887, August.
    15. Diana Hicks, 1999. "The difficulty of achieving full coverage of international social science literature and the bibliometric consequences," Scientometrics, Springer;Akadémiai Kiadó, vol. 44(2), pages 193-215, February.
    16. Björn Hammarfelt & Alexander D. Rushforth, 2017. "Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation," Research Evaluation, Oxford University Press, vol. 26(3), pages 169-180.
    17. Gualberto Buela-Casal & Izabela Zych, 2012. "What do the scientists think about the impact factor?," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 281-292, August.
    18. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    19. Steffen Lemke & Athanasios Mazarakis & Isabella Peters, 2021. "Conjoint analysis of researchers' hidden preferences for bibliometrics, altmetrics, and usage metrics," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 72(6), pages 777-792, June.
    20. Alberto Martín-Martín & Mike Thelwall & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2021. "Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 871-906, January.
    21. Gaby Haddow & Björn Hammarfelt, 2019. "Quality, impact, and quantification: Indicators and metrics use by social scientists," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 70(1), pages 16-26, January.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eugenio Petrovich, 2022. "Bibliometrics in Press. Representations and uses of bibliometric indicators in the Italian daily newspapers," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2195-2233, May.
    2. Pantea Kamrani & Isabelle Dorsch & Wolfgang G. Stock, 2021. "Do researchers know what the h-index is? And how do they estimate its importance?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5489-5508, July.
    3. Sven Helmer & David B. Blumenthal & Kathrin Paschen, 2020. "What is meaningful research and how should we measure it?," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 153-169, October.
    4. Steffen Lemke & Athanasios Mazarakis & Isabella Peters, 2021. "Conjoint analysis of researchers' hidden preferences for bibliometrics, altmetrics, and usage metrics," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 72(6), pages 777-792, June.
    5. Christian Schneijderberg & Nicolai Götze & Lars Müller, 2022. "A study of 25 years of publication outputs in the German academic profession," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(1), pages 1-28, January.
    6. Csomós, György, 2020. "Introducing recalibrated academic performance indicators in the evaluation of individuals’ research performance: A case study from Eastern Europe," Journal of Informetrics, Elsevier, vol. 14(4).
    7. Yang, Siluo & Zheng, Mengxue & Yu, Yonghao & Wolfram, Dietmar, 2021. "Are Altmetric.com scores effective for research impact evaluation in the social sciences and humanities?," Journal of Informetrics, Elsevier, vol. 15(1).
    8. Sándor Soós & Zsófia Vida & András Schubert, 2018. "Long-term trends in the multidisciplinarity of some typical natural and social sciences, and its implications on the SSH versus STM distinction," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(3), pages 795-822, March.
    9. Zoltán Krajcsák, 2021. "Researcher Performance in Scopus Articles (RPSA) as a New Scientometric Model of Scientific Output: Tested in Business Area of V4 Countries," Publications, MDPI, vol. 9(4), pages 1-23, October.
    10. Andrea Mervar & Maja Jokić, 2022. "Core-periphery nexus in the EU social sciences: bibliometric perspective," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5793-5817, October.
    11. Emanuel Kulczycki & Władysław Marek Kolasa & Krystian Szadkowski, 2021. "Marx, Engels, Lenin, and Stalin as highly cited researchers? Historical bibliometrics study," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(10), pages 8683-8700, October.
    12. van den Besselaar, Peter & Heyman, Ulf & Sandström, Ulf, 2017. "Perverse effects of output-based research funding? Butler’s Australian case revisited," Journal of Informetrics, Elsevier, vol. 11(3), pages 905-918.
    13. Ekaterina L. Dyachenko, 2014. "Internationalization of academic journals: Is there still a gap between social and natural sciences?," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 241-255, October.
    14. Linda Sīle & Raf Vanderstraeten, 2019. "Measuring changes in publication patterns in a context of performance-based research funding systems: the case of educational research in the University of Gothenburg (2005–2014)," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 71-91, January.
    15. Andrea Bonaccorsi & Cinzia Daraio & Stefano Fantoni & Viola Folli & Marco Leonetti & Giancarlo Ruocco, 2017. "Do social sciences and humanities behave like life and hard sciences?," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(1), pages 607-653, July.
    16. Alessandro Margherita & Gianluca Elia & Claudio Petti, 2022. "What Is Quality in Research? Building a Framework of Design, Process and Impact Attributes and Evaluation Perspectives," Sustainability, MDPI, vol. 14(5), pages 1-18, March.
    17. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    18. Torger Möller & Marion Schmidt & Stefan Hornbostel, 2016. "Assessing the effects of the German Excellence Initiative with bibliometric methods," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2217-2239, December.
    19. Andrea Diem & Stefan C. Wolter, 2011. "The Use of Bibliometrics to Measure Research Performance in Education Sciences," Economics of Education Working Paper Series 0066, University of Zurich, Department of Business Administration (IBW), revised May 2013.
    20. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:127:y:2022:i:4:d:10.1007_s11192-022-04265-1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.