Printed from https://ideas.repec.org/a/bla/jinfst/v74y2023i8p941-953.html

In which fields are citations indicators of research quality?

Author

Listed:
  • Mike Thelwall
  • Kayvan Kousha
  • Emma Stuart
  • Meiko Makita
  • Mahshid Abdoli
  • Paul Wilson
  • Jonathan Levitt

Abstract

Citation counts are widely used as indicators of research quality to support or replace human peer review and for lists of top cited papers, researchers, and institutions. Nevertheless, the relationship between citations and research quality is poorly evidenced. We report the first large‐scale science‐wide academic evaluation of the relationship between research quality and citations (field normalized citation counts), correlating them for 87,739 journal articles in 34 field‐based UK Units of Assessment (UoA). The two correlate positively in all academic fields, from very weak (0.1) to strong (0.5), reflecting broadly linear relationships in all fields. We give the first evidence that the correlations are positive even across the arts and humanities. The patterns are similar for the field classification schemes of Scopus and Dimensions.ai, although varying for some individual subjects and therefore more uncertain for these. We also show for the first time that no field has a citation threshold beyond which all articles are excellent quality, so lists of top cited articles are not pure collections of excellence, and neither is any top citation percentile indicator. Thus, while appropriately field normalized citations associate positively with research quality in all fields, they never perfectly reflect it, even at high values.
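The abstract's core method — field-normalizing citation counts so they are comparable across disciplines, then correlating them with quality scores — can be sketched as follows. This is a hypothetical illustration only: the toy data, the MNCS-style mean normalization, and the use of Spearman correlation are assumptions for demonstration, not the paper's exact procedure.

```python
from statistics import mean

def field_normalize(articles):
    """Divide each article's citation count by the mean citation count of its
    field (an MNCS-style normalization), so that scores are comparable across
    fields with very different citation cultures."""
    by_field = {}
    for a in articles:
        by_field.setdefault(a["field"], []).append(a["citations"])
    field_means = {f: mean(c) for f, c in by_field.items()}
    return [a["citations"] / field_means[a["field"]] for a in articles]

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks.
    Ties are broken by input order here, which is fine for an illustration
    but not for production use."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0.0] * len(vals)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    var = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return cov / var

# Toy data: raw citation counts differ by an order of magnitude between the
# two fields, but normalization puts both on a common scale before correlating
# with (hypothetical) peer-review quality scores.
articles = [
    {"field": "history", "citations": 1,  "quality": 2},
    {"field": "history", "citations": 3,  "quality": 3},
    {"field": "history", "citations": 8,  "quality": 4},
    {"field": "biology", "citations": 10, "quality": 2},
    {"field": "biology", "citations": 40, "quality": 3},
    {"field": "biology", "citations": 90, "quality": 4},
]
normalized = field_normalize(articles)
rho = spearman(normalized, [a["quality"] for a in articles])
```

On this toy data the normalized scores for the two fields fall in the same range (roughly 0.2 to 2.0), and the rank correlation with quality is positive, mirroring the paper's finding of positive but imperfect association.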

Suggested Citation

  • Mike Thelwall & Kayvan Kousha & Emma Stuart & Meiko Makita & Mahshid Abdoli & Paul Wilson & Jonathan Levitt, 2023. "In which fields are citations indicators of research quality?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(8), pages 941-953, August.
  • Handle: RePEc:bla:jinfst:v:74:y:2023:i:8:p:941-953
    DOI: 10.1002/asi.24767

    Download full text from publisher

    File URL: https://doi.org/10.1002/asi.24767
    Download Restriction: no

    File URL: https://libkey.io/10.1002/asi.24767?utm_source=ideas

    References listed on IDEAS

    1. Linda Butler & Ian McAllister, 2009. "Metrics or Peer Review? Evaluating the 2001 UK Research Assessment Exercise in Political Science," Political Studies Review, Political Studies Association, vol. 7(1), pages 3-17, January.
    2. Ludo Waltman & Clara Calero‐Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    3. Mike Thelwall & Nabeil Maflahi, 2020. "Academic collaboration rates and citation associations vary substantially between countries and fields," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 71(8), pages 968-978, August.
    4. M.H. MacRoberts & B.R. MacRoberts, 2010. "Problems of citation analysis: A study of uncited and seldom-cited influences," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(1), pages 1-12, January.
    5. Curtis P. Berlinguette & Yet-Ming Chiang & Jeremy N. Munday & Thomas Schenkel & David K. Fork & Ross Koningstein & Matthew D. Trevithick, 2019. "Revisiting the cold case of cold fusion," Nature, Nature, vol. 570(7759), pages 45-51, June.
    6. Rinia, E. J. & van Leeuwen, Th. N. & van Vuren, H. G. & van Raan, A. F. J., 1998. "Comparative analysis of a set of bibliometric indicators and central peer review criteria: Evaluation of condensed matter physics in the Netherlands," Research Policy, Elsevier, vol. 27(1), pages 95-107, May.
    7. Diana Hicks & Paul Wouters & Ludo Waltman & Sarah de Rijcke & Ismael Rafols, 2015. "Bibliometrics: The Leiden Manifesto for research metrics," Nature, Nature, vol. 520(7548), pages 429-431, April.
    8. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Di Costa, 2011. "National research assessment exercises: a comparison of peer review and bibliometrics rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 929-941, December.
    9. Thelwall, Mike, 2017. "Three practical field normalised alternative indicator formulae for research evaluation," Journal of Informetrics, Elsevier, vol. 11(1), pages 128-151.
    10. Alberto Baccini & Giuseppe De Nicolao, 2016. "Do they agree? Bibliometric evaluation versus informed peer review in the Italian research assessment exercise," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1651-1671, September.
    11. Giovanni Abramo & Ciriaco Andrea D’Angelo, 2016. "Refrain from adopting the combination of citation and journal metrics to grade publications, as used in the Italian national research assessment exercise (VQR 2011–2014)," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2053-2065, December.
    12. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    13. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
    14. Dag W Aksnes & Randi Elisabeth Taxt, 2004. "Peer reviews and bibliometric indicators: a comparative study at a Norwegian university," Research Evaluation, Oxford University Press, vol. 13(1), pages 33-41, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    2. Abramo, Giovanni, 2018. "Revisiting the scientometric conceptualization of impact and its measurement," Journal of Informetrics, Elsevier, vol. 12(3), pages 590-597.
    3. Robert A. Buckle & John Creedy, 2022. "Methods to evaluate institutional responses to performance‐based research funding systems," Australian Economic Papers, Wiley Blackwell, vol. 61(3), pages 615-634, September.
    4. Mike Thelwall, 2019. "The influence of highly cited papers on field normalised indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 519-537, February.
    5. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    6. Giovanni Abramo & Tindaro Cicero & Ciriaco Andrea D’Angelo, 2013. "National peer-review research assessment exercises for the hard sciences can be a complete waste of money: the Italian case," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(1), pages 311-324, April.
    7. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    8. Lutz Bornmann, 2020. "Bibliometrics-based decision tree (BBDT) for deciding whether two universities in the Leiden ranking differ substantially in their performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1255-1258, February.
    9. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    10. Robert A. Buckle & John Creedy & Ashley Ball, 2021. "Fifteen Years of a PBRFS in New Zealand: Incentives and Outcomes," Australian Economic Review, The University of Melbourne, Melbourne Institute of Applied Economic and Social Research, vol. 54(2), pages 208-230, June.
    11. Abramo, Giovanni & Cicero, Tindaro & D’Angelo, Ciriaco Andrea, 2011. "Assessing the varying level of impact measurement accuracy as a function of the citation window length," Journal of Informetrics, Elsevier, vol. 5(4), pages 659-667.
    12. Jacques Wainer & Paula Vieira, 2013. "Correlations between bibliometrics and peer evaluation for all disciplines: the evaluation of Brazilian scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 395-410, August.
    13. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Di Costa, 2011. "National research assessment exercises: a comparison of peer review and bibliometrics rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 929-941, December.
    14. Ehsan Mohammadi & Mike Thelwall, 2013. "Assessing non-standard article impact using F1000 labels," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 383-395, November.
    15. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    16. Andersen, Jens Peter, 2017. "An empirical and theoretical critique of the Euclidean index," Journal of Informetrics, Elsevier, vol. 11(2), pages 455-465.
    17. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    18. Abramo, Giovanni & Cicero, Tindaro & D’Angelo, Ciriaco Andrea, 2012. "A sensitivity analysis of researchers’ productivity rankings to the time of citation observation," Journal of Informetrics, Elsevier, vol. 6(2), pages 192-201.
    19. Robin Haunschild & Lutz Bornmann, 2022. "Relevance of document types in the scores’ calculation of a specific field-normalized indicator: Are the scores strongly dependent on or nearly independent of the document type handling?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4419-4438, August.
    20. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jinfst:v:74:y:2023:i:8:p:941-953. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item, and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www.asis.org .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.