
Algorithmically generated subject categories based on citation relations: An empirical micro study using papers on overall water splitting

Authors

  • Haunschild, Robin
  • Schier, Hermann
  • Marx, Werner
  • Bornmann, Lutz

Abstract

One important reason for the use of field categorization in bibliometrics is the necessity to make citation impact of papers published in different scientific fields comparable with each other. Raw citations are normalized by using field-categorization schemes to achieve comparable citation scores. There are different approaches to field categorization available. They can be broadly classified as intellectual and algorithmic approaches. A paper-based algorithmically constructed classification system (ACCS) was proposed which is based on citation relations. Using a few ACCS field-specific clusters, we investigate the discriminatory power of the ACCS. The micro study focusses on the topic ‘overall water splitting’ and related topics. The first part of the study investigates intellectually whether the ACCS is able to identify papers on overall water splitting reliably and validly. Next, we compare the ACCS with (1) a paper-based intellectual (INSPEC) classification and (2) a journal-based intellectual classification (Web of Science, WoS, subject categories). In the last part of our case study, we compare the average number of citations in selected ACCS clusters (on overall water splitting and related topics) with the average citation count of publications in WoS subject categories related to these clusters. The results of this micro study question the discriminatory power of the ACCS. We recommend larger follow-up studies on broad datasets.
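
The abstract describes mean-based field normalization: a paper's raw citation count is divided by the average citation count of the papers in its field, where the "field" is supplied by a categorization scheme such as an ACCS cluster, an INSPEC class, or a WoS subject category. The sketch below only illustrates this idea; it is not code from the paper, and the paper IDs, field labels, and citation counts are invented.

    # Illustrative sketch of mean-based field normalization (not from the paper).
    from collections import defaultdict

    def field_normalized_scores(papers):
        """Return each paper's citations divided by the mean citations of its field."""
        by_field = defaultdict(list)
        for p in papers:
            by_field[p["field"]].append(p["citations"])
        field_mean = {f: sum(c) / len(c) for f, c in by_field.items()}
        return {
            p["id"]: (p["citations"] / field_mean[p["field"]]) if field_mean[p["field"]] else 0.0
            for p in papers
        }

    # Hypothetical data; "field" may come from any categorization scheme
    # (ACCS cluster, INSPEC class, WoS subject category, ...).
    papers = [
        {"id": "A", "field": "ACCS cluster: overall water splitting", "citations": 40},
        {"id": "B", "field": "ACCS cluster: overall water splitting", "citations": 10},
        {"id": "C", "field": "WoS subject category: Electrochemistry", "citations": 12},
        {"id": "D", "field": "WoS subject category: Electrochemistry", "citations": 4},
    ]
    print(field_normalized_scores(papers))  # {'A': 1.6, 'B': 0.4, 'C': 1.5, 'D': 0.5}

Because the same paper can receive a different normalized score depending on which scheme supplies its field, the discriminatory power of the scheme (here, ACCS clusters versus WoS subject categories) directly affects the resulting impact scores; this is the comparison the study carries out at the micro level.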

Suggested Citation

  • Haunschild, Robin & Schier, Hermann & Marx, Werner & Bornmann, Lutz, 2018. "Algorithmically generated subject categories based on citation relations: An empirical micro study using papers on overall water splitting," Journal of Informetrics, Elsevier, vol. 12(2), pages 436-447.
  • Handle: RePEc:eee:infome:v:12:y:2018:i:2:p:436-447
    DOI: 10.1016/j.joi.2018.03.004

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S175115771730278X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2018.03.004?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Wang, Qi & Waltman, Ludo, 2016. "Large-scale analysis of the accuracy of the journal classification systems of Web of Science and Scopus," Journal of Informetrics, Elsevier, vol. 10(2), pages 347-364.
    2. Ludo Waltman & Nees Jan van Eck, 2013. "A smart local moving algorithm for large-scale modularity-based community detection," The European Physical Journal B: Condensed Matter and Complex Systems, Springer;EDP Sciences, vol. 86(11), pages 1-14, November.
    3. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    4. Richard Klavans & Kevin W. Boyack, 2017. "Which Type of Citation Analysis Generates the Most Accurate Taxonomy of Scientific and Technical Knowledge?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(4), pages 984-998, April.
    5. Loet Leydesdorff & Tobias Opthof, 2013. "Citation analysis with medical subject headings (MeSH) using the Web of Knowledge: A new routine," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(5), pages 1076-1080, May.
    6. Ismael Rafols & Loet Leydesdorff, 2009. "Content‐based and algorithmic classifications of journals: Perspectives on the dynamics of scientific communication and indexer effects," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(9), pages 1823-1835, September.
    7. Werner Marx & Lutz Bornmann, 2015. "On the causes of subject-specific citation rates in Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(2), pages 1823-1827, February.
    8. Lutz Bornmann & Werner Marx & Andreas Barth, 2013. "The Normalization of Citation Counts Based on Classification Systems," Publications, MDPI, vol. 1(2), pages 1-9, August.
    9. Ludo Waltman & Nees Jan van Eck, 2013. "Source normalized indicators of citation impact: an overview of different approaches and an empirical comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 699-716, September.
    10. Lutz Bornmann & Hans‐Dieter Daniel, 2008. "Selecting manuscripts for a high-impact journal through peer review: A citation analysis of communications that were accepted by Angewandte Chemie International Edition, or rejected but published elsewhere," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(11), pages 1841-1852, September.
    11. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    12. Sjögårde, Peter & Ahlgren, Per, 2018. "Granularity of algorithmically constructed publication-level classifications of research publications: Identification of topics," Journal of Informetrics, Elsevier, vol. 12(1), pages 133-152.
    13. Bornmann, Lutz & Marx, Werner, 2015. "Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts?," Journal of Informetrics, Elsevier, vol. 9(2), pages 408-418.
    14. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    15. Michael E. Reichenheim, 2001. "Sample size for the kappa-statistic of interrater agreement," Stata Technical Bulletin, StataCorp LP, vol. 10(58).
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Matthias Held & Grit Laudel & Jochen Gläser, 2021. "Challenges to the validity of topic reconstruction," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 4511-4536, May.
    2. Bornmann, Lutz & Haunschild, Robin, 2022. "Empirical analysis of recent temporal dynamics of research fields: Annual publications in chemistry and related areas as an example," Journal of Informetrics, Elsevier, vol. 16(2).
    3. Lutz Bornmann & K. Brad Wray & Robin Haunschild, 2020. "Citation concept analysis (CCA): a new form of citation analysis revealing the usefulness of concepts for other researchers illustrated by exemplary case studies including classic books by Thomas S. K," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1051-1074, February.
    4. Lin Zhang & Beibei Sun & Fei Shu & Ying Huang, 2022. "Comparing paper level classifications across different methods and systems: an investigation of Nature publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7633-7651, December.
    5. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    6. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    7. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    2. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    3. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    4. Haunschild, Robin & Daniels, Angela D. & Bornmann, Lutz, 2022. "Scores of a specific field-normalized indicator calculated with different approaches of field-categorization: Are the scores different or similar?," Journal of Informetrics, Elsevier, vol. 16(1).
    5. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    6. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    7. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    8. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    9. Lin Zhang & Beibei Sun & Fei Shu & Ying Huang, 2022. "Comparing paper level classifications across different methods and systems: an investigation of Nature publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7633-7651, December.
    10. Bornmann, Lutz & Haunschild, Robin, 2022. "Empirical analysis of recent temporal dynamics of research fields: Annual publications in chemistry and related areas as an example," Journal of Informetrics, Elsevier, vol. 16(2).
    11. Gerson Pech & Catarina Delgado & Silvio Paolo Sorella, 2022. "Classifying papers into subfields using Abstracts, Titles, Keywords and KeyWords Plus through pattern detection and optimization procedures: An application in Physics," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(11), pages 1513-1528, November.
    12. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    13. Sjögårde, Peter & Ahlgren, Per, 2018. "Granularity of algorithmically constructed publication-level classifications of research publications: Identification of topics," Journal of Informetrics, Elsevier, vol. 12(1), pages 133-152.
    14. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    15. Lutz Bornmann & Robin Haunschild & Sven E. Hug, 2018. "Visualizing the context of citations referencing papers published by Eugene Garfield: a new type of keyword co-occurrence analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 427-437, February.
    16. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    17. Perianes-Rodriguez, Antonio & Ruiz-Castillo, Javier, 2017. "A comparison of the Web of Science and publication-level classification systems of science," Journal of Informetrics, Elsevier, vol. 11(1), pages 32-45.
    18. Yuanyuan Liu & Qiang Wu & Shijie Wu & Yong Gao, 2021. "Weighted citation based on ranking-related contribution: a new index for evaluating article impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(10), pages 8653-8672, October.
    19. Carusi, Chiara & Bianchi, Giuseppe, 2019. "Scientific community detection via bipartite scholar/journal graph co-clustering," Journal of Informetrics, Elsevier, vol. 13(1), pages 354-386.
    20. Roberto Camerani & Daniele Rotolo & Nicola Grassano, 2018. "Do Firms Publish? A Multi-Sectoral Analysis," SPRU Working Paper Series 2018-21, SPRU - Science Policy Research Unit, University of Sussex Business School.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:12:y:2018:i:2:p:436-447. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.