
The accuracy of field classifications for journals in Scopus

Author

Listed:
  • Mike Thelwall (University of Sheffield)
  • Stephen Pinfield (University of Sheffield)
Abstract

Journal field classifications in Scopus are used for citation-based indicators and by authors choosing appropriate journals to submit to. Whilst prior research has found that Scopus categories are occasionally misleading, it is not known how this varies for different journal types. In response, we assessed whether specialist, cross-field and general academic journals sometimes have publication practices that do not match their Scopus classifications. For this, we compared the Scopus narrow fields of journals with the fields that best fit their articles’ titles and abstracts. We also conducted qualitative follow-up to distinguish between Scopus classification errors and misleading journal aims. The results show sharp field differences in the extent to which both cross-field and apparently specialist journals publish articles that match their Scopus narrow fields, and the same for general journals. The results also suggest that a few journals have titles and aims that do not match their contents well, and that some large topics spread themselves across many relevant fields. Thus, the likelihood that a journal’s Scopus narrow fields reflect its contents varies substantially by field (although without systematic field trends) and some cross-field topics seem to cause difficulties in appropriately classifying relevant journals. These issues undermine citation-based indicators that rely on journal-level classification and may confuse scholars seeking publishing venues.
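
As a concrete illustration of the comparison described in the abstract, the sketch below checks whether each journal's assigned field matches the field whose aggregate article vocabulary best fits that journal's own titles and abstracts. This is a minimal, hypothetical illustration, not the method used in the paper: the journal names, field labels, texts, and the leave-one-out cosine scoring are all invented for the example, whereas the published study worked with the full Scopus narrow-field scheme and far larger text corpora.

```python
# Toy sketch: compare each journal's assigned field with the field whose
# article vocabulary best fits the journal's own titles and abstracts.
# All journal names, field labels and texts are invented for illustration.
from collections import Counter
import math
import re

def tokenize(text):
    """Lowercase word tokens longer than three characters."""
    return [w for w in re.findall(r"[a-z]+", text.lower()) if len(w) > 3]

# (journal name, assigned field, concatenated titles/abstracts) -- hypothetical data
journals = [
    ("Journal of Citation Studies", "Library & Information Sciences",
     "citation analysis of journal indicators and bibliometric field normalisation"),
    ("Bibliometric Methods Review", "Library & Information Sciences",
     "field normalisation of citation indicators and journal level bibliometric analysis"),
    ("Applied Energy Letters", "Library & Information Sciences",  # deliberately mismatched
     "solar photovoltaic panels grid storage and battery modelling"),
    ("Renewable Power Reports", "Energy",
     "wind generation forecasting and turbine demand response for renewable storage"),
    ("Wind Engineering Quarterly", "Energy",
     "wind turbine blade design and offshore generation efficiency"),
]

def cosine(c1, c2):
    """Cosine similarity between two term-frequency Counters."""
    num = sum(c1[t] * c2[t] for t in set(c1) & set(c2))
    den = math.sqrt(sum(v * v for v in c1.values())) * math.sqrt(sum(v * v for v in c2.values()))
    return num / den if den else 0.0

for name, assigned, text in journals:
    # Leave-one-out: build each field's vocabulary from the other journals only,
    # so a journal is never matched against its own text.
    profiles = {}
    for other_name, field, other_text in journals:
        if other_name != name:
            profiles.setdefault(field, Counter()).update(tokenize(other_text))
    own = Counter(tokenize(text))
    best = max(profiles, key=lambda f: cosine(own, profiles[f]))
    status = "matches" if best == assigned else "MISMATCH"
    print(f"{name}: assigned {assigned!r}, best-fitting field {best!r} -> {status}")
```

The leave-one-out step is only a design choice for this toy example, ensuring a journal is never scored against its own text; a journal-level analysis of the kind the abstract describes would aggregate article-level evidence across a much larger corpus before drawing conclusions about a classification.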

Suggested Citation

  • Mike Thelwall & Stephen Pinfield, 2024. "The accuracy of field classifications for journals in Scopus," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(2), pages 1097-1117, February.
  • Handle: RePEc:spr:scient:v:129:y:2024:i:2:d:10.1007_s11192-023-04901-4
    DOI: 10.1007/s11192-023-04901-4

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-023-04901-4
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-023-04901-4?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Wang, Qi & Waltman, Ludo, 2016. "Large-scale analysis of the accuracy of the journal classification systems of Web of Science and Scopus," Journal of Informetrics, Elsevier, vol. 10(2), pages 347-364.
    2. Richard Klavans & Kevin W. Boyack, 2017. "Which Type of Citation Analysis Generates the Most Accurate Taxonomy of Scientific and Technical Knowledge?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(4), pages 984-998, April.
    3. Antonio J. Gómez-Núñez & Benjamín Vargas-Quesada & Félix Moya-Anegón, 2016. "Updating the SCImago journal and country rank classification: A new approach using Ward's clustering and alternative combination of citation measures," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(1), pages 178-190, January.
    4. Alberto Martín-Martín & Mike Thelwall & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2021. "Correction to: Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 907-908, January.
    5. repec:plo:pone00:0039464 is not listed on IDEAS
    6. Jing Zhang & Xiaomin Liu & Lili Wu, 2016. "The study of subject-classification based on journal coupling and expert subject-classification system," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1149-1170, June.
    7. Kevin W Boyack & David Newman & Russell J Duhon & Richard Klavans & Michael Patek & Joseph R Biberstine & Bob Schijvenaars & André Skupin & Nianli Ma & Katy Börner, 2011. "Clustering More than Two Million Biomedical Publications: Comparing the Accuracies of Nine Text-Based Similarity Approaches," PLOS ONE, Public Library of Science, vol. 6(3), pages 1-11, March.
    8. Alberto Martín-Martín & Mike Thelwall & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2021. "Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: a multidisciplinary comparison of coverage via citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 871-906, January.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Daniel Icaza Alvarez & Fernando González-Ladrón-de-Guevara & Jorge Rojas Espinoza & David Borge-Diez & Santiago Pulla Galindo & Carlos Flores-Vázquez, 2025. "The Evolution of AI Applications in the Energy System Transition: A Bibliometric Analysis of Research Development, the Current State and Future Challenges," Energies, MDPI, vol. 18(6), pages 1-31, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    2. Fernando Morante-Carballo & Néstor Montalván-Burbano & Maribel Aguilar-Aguilar & Paúl Carrión-Mero, 2022. "A Bibliometric Analysis of the Scientific Research on Artisanal and Small-Scale Mining," IJERPH, MDPI, vol. 19(13), pages 1-29, July.
    3. Lin Zhang & Beibei Sun & Fei Shu & Ying Huang, 2022. "Comparing paper level classifications across different methods and systems: an investigation of Nature publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7633-7651, December.
    4. Michael Gusenbauer, 2022. "Search where you will find most: Comparing the disciplinary coverage of 56 bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2683-2745, May.
    5. Shir Aviv-Reuven & Ariel Rosenfeld, 2023. "A logical set theory approach to journal subject classification analysis: intra-system irregularities and inter-system discrepancies in Web of Science and Scopus," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 157-175, January.
    6. Sitaram Devarakonda & Dmitriy Korobskiy & Tandy Warnow & George Chacko, 2020. "Viewing computer science through citation analysis: Salton and Bergmark Redux," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(1), pages 271-287, October.
    7. Ramón A. Feenstra & Emilio Delgado López-Cózar, 2022. "Philosophers’ appraisals of bibliometric indicators and their use in evaluation: from recognition to knee-jerk rejection," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 2085-2103, April.
    8. Mehdi Toloo & Rouhollah Khodabandelou & Amar Oukil, 2022. "A Comprehensive Bibliometric Analysis of Fractional Programming (1965–2020)," Mathematics, MDPI, vol. 10(11), pages 1-21, May.
    9. Paul Donner, 2021. "Validation of the Astro dataset clustering solutions with external data," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1619-1645, February.
    10. Dušan Nikolić & Dragan Ivanović & Lidija Ivanović, 2024. "An open-source tool for merging data from multiple citation databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4573-4595, July.
    11. Basheer Kalash & Sarah Guillou & Lionel Nesta & Michele Pezzoni, 2024. "Does Lab Funding Matter for the Technological Application of Scientific Research? An Empirical Analysis of French Labs," Annals of Economics and Statistics, GENES, issue 153, pages 39-76.
    12. Anting Wang & Mohd Nizam Osman & Megat Al-Imran Yasin & Nurul Ain Mohd Hasan & Ying Cui, 2024. "Tracing Evolution and Communication Dynamics in Chinese Independent Documentary Films (2012-2022): A Systematic Review of Genre, Censorship, Culture, and Distribution," Studies in Media and Communication, Redfame publishing, vol. 12(1), pages 368-381, March.
    13. Elena Andriollo & Alberto Caimo & Laura Secco & Elena Pisani, 2021. "Collaborations in Environmental Initiatives for an Effective “Adaptive Governance” of Social–Ecological Systems: What Existing Literature Suggests," Sustainability, MDPI, vol. 13(15), pages 1-29, July.
    14. repec:dar:wpaper:132320 is not listed on IDEAS
    15. Ballester, Omar & Penner, Orion, 2022. "Robustness, replicability and scalability in topic modelling," Journal of Informetrics, Elsevier, vol. 16(1).
    16. Enrique Orduña-Malea & Núria Bautista-Puig, 2024. "Research assessment under debate: disentangling the interest around the DORA declaration on Twitter," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(1), pages 537-559, January.
    17. Kelly Gerakoudi-Ventouri, 2022. "Review of studies of blockchain technology effects on the shipping industry," Journal of Shipping and Trade, Springer, vol. 7(1), pages 1-18, December.
    18. Safoora Pitsi & Jon Billsberry & Mary Barrett, 2024. "A Bibliometric Review of Research on Intelligence in Leadership Studies," FIIB Business Review, , vol. 13(5), pages 528-541, October.
    19. Yang Ding & Fernando Moreira, 2025. "Funding and productivity: Does winning grants affect the scientific productivity of recipients? Evidence from the social sciences and economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 130(3), pages 1831-1870, March.
    20. Gerson Pech & Catarina Delgado & Silvio Paolo Sorella, 2022. "Classifying papers into subfields using Abstracts, Titles, Keywords and KeyWords Plus through pattern detection and optimization procedures: An application in Physics," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(11), pages 1513-1528, November.
    21. DIODATO Dario, 2024. "Handbook of Economic Complexity for Policy," JRC Research Reports JRC138666, Joint Research Centre.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:129:y:2024:i:2:d:10.1007_s11192-023-04901-4. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.