
Experimental evaluation of parameter settings in calculation of hybrid similarities: effects of first- and second-order similarity, edge cutting, and weighting factors

Author

Listed:
  • Fabian Meyer-Brötz

    (University of Ulm)

  • Edgar Schiebel

    (AIT Austrian Institute of Technology GmbH)

  • Leo Brecht

    (University of Ulm)

Abstract

The ongoing discussion in the bibliometric community about the best similarity measures has led to diverse insights. Although these insights are sometimes contradictory, one conclusion is very consistent: hybrid measures outperform their individual components. While this initially answers the question of which similarity measure is best, it also raises questions that have so far been resolved only in part for conventional similarity measures. In this study we therefore investigate the impact of the weighting factor, the appropriate level of edge cutting, the performance of first- versus second-order similarities, and the interaction of these three parameters in the context of hybrid similarities. Building on a dataset of over 8000 articles from the manufacturing engineering field and varying the parameter settings, we calculated over 100 similarity matrices. For each matrix we determined several cluster solutions at different resolution levels, ranging from 100 to 1000 clusters, and evaluated them quantitatively with a textual coherence value based on the Jensen-Shannon divergence. We found that second-order hybrid similarity measures, calculated with a weighting factor of 0.6 for the citation-based similarity and with edge cutting that retains only the strongest values, yield the best clustering results. Furthermore, we found the assessed parameters to be highly interdependent; for example, the hybrid first-order measure outperforms the second-order measure when no edge cutting is applied. Our results can therefore serve the bibliometric community as a guideline for the appropriate application of hybrid measures.
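
The ingredients named in the abstract (a weighted combination of citation-based and textual similarity, edge cutting, second-order similarity, and a Jensen-Shannon-based coherence score) can be illustrated with a short sketch. The Python code below is a minimal, hypothetical illustration under stated assumptions; the function names, the top-k edge-cutting rule, the row normalization, and the exact coherence definition are assumptions for illustration and do not reproduce the authors' implementation.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon  # SciPy >= 1.2


def hybrid_similarity(cit_sim, text_sim, weight=0.6):
    """Weighted linear combination of a citation-based and a textual
    document-document similarity matrix (both assumed scaled to [0, 1])."""
    return weight * cit_sim + (1.0 - weight) * text_sim


def cut_edges(sim, keep_top=20):
    """Edge cutting (illustrative top-k rule): keep only the `keep_top`
    strongest similarities per document, set weaker edges to zero,
    and symmetrize the result."""
    cut = np.zeros_like(sim)
    for i, row in enumerate(sim):
        strongest = np.argsort(row)[-keep_top:]
        cut[i, strongest] = row[strongest]
    return np.maximum(cut, cut.T)


def second_order(sim):
    """Second-order similarity: cosine similarity between the rows of a
    first-order similarity matrix, i.e. two documents are similar if they
    are similar to the same set of documents."""
    norms = np.linalg.norm(sim, axis=1, keepdims=True)
    unit = sim / np.clip(norms, 1e-12, None)
    return unit @ unit.T


def textual_coherence(cluster_term_dists, corpus_term_dist):
    """Illustrative coherence score: mean Jensen-Shannon divergence between
    each cluster's term distribution and the corpus-wide term distribution
    (SciPy returns the JS distance, so it is squared to obtain the divergence)."""
    return float(np.mean([jensenshannon(p, corpus_term_dist) ** 2
                          for p in cluster_term_dists]))
```

A parameter sweep in the spirit of the study would then, for each weighting factor and edge-cutting level, cluster both the first-order matrix `cut_edges(hybrid_similarity(...))` and its second-order counterpart, and compare the resulting cluster solutions via the coherence score.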

Suggested Citation

  • Fabian Meyer-Brötz & Edgar Schiebel & Leo Brecht, 2017. "Experimental evaluation of parameter settings in calculation of hybrid similarities: effects of first- and second-order similarity, edge cutting, and weighting factors," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(3), pages 1307-1325, June.
  • Handle: RePEc:spr:scient:v:111:y:2017:i:3:d:10.1007_s11192-017-2366-2
    DOI: 10.1007/s11192-017-2366-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-017-2366-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-017-2366-2?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where your library subscription provides access to this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Edgar Schiebel, 2012. "Visualization of research fronts and knowledge bases by three-dimensional areal densities of bibliographically coupled publications and co-citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(2), pages 557-566, May.
    2. Yongli Li & Guijie Zhang & Yuqiang Feng & Chong Wu, 2015. "An entropy-based social network community detecting method and its application to scientometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 1003-1017, January.
    3. Richard Klavans & Kevin W. Boyack, 2017. "Which Type of Citation Analysis Generates the Most Accurate Taxonomy of Scientific and Technical Knowledge?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(4), pages 984-998, April.
    4. Kevin W. Boyack & Richard Klavans, 2010. "Co‐citation analysis, bibliographic coupling, and direct citation: Which citation approach represents the research front most accurately?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(12), pages 2389-2404, December.
    5. Kevin W. Boyack & Richard Klavans, 2014. "Creation of a highly detailed, dynamic, global model and map of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(4), pages 670-685, April.
    6. Xiangfeng Meng & Xinhai Liu & YunHai Tong & Wolfgang Glänzel & Shaohua Tan, 2015. "Multi-view clustering with exemplars for scientific mapping," Scientometrics, Springer;Akadémiai Kiadó, vol. 105(3), pages 1527-1552, December.
    7. Bart Thijs & Edgar Schiebel & Wolfgang Glänzel, 2013. "Do second-order similarities provide added-value in a hybrid approach?," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 667-677, September.
    8. Kurt Hornik & Christian Buchta & Achim Zeileis, 2009. "Open-source machine learning: R meets Weka," Computational Statistics, Springer, vol. 24(2), pages 225-232, May.
    9. Cristian Colliander & Per Ahlgren, 2012. "Experimental comparison of first and second-order similarities in a scientometric context," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(2), pages 675-685, February.
    10. Lawrence Hubert & Phipps Arabie, 1985. "Comparing partitions," Journal of Classification, Springer;The Classification Society, vol. 2(1), pages 193-218, December.
    11. Ahlgren, Per & Colliander, Cristian, 2009. "Document–document similarity approaches and science mapping: Experimental comparison of five approaches," Journal of Informetrics, Elsevier, vol. 3(1), pages 49-63.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Dejian Yu & Wanru Wang & Shuai Zhang & Wenyu Zhang & Rongyu Liu, 2017. "Hybrid self-optimized clustering model based on citation links and textual features to detect research topics," PLOS ONE, Public Library of Science, vol. 12(10), pages 1-21, October.
    2. Klaus Kammerer & Manuel Göster & Manfred Reichert & Rüdiger Pryss, 2021. "Ambalytics: A Scalable and Distributed System Architecture Concept for Bibliometric Network Analyses," Future Internet, MDPI, vol. 13(8), pages 1-29, August.
    3. Guadalupe Palacios-Núñez & Gabriel Vélez-Cuartas & Juan D. Botero, 2018. "Developmental tendencies in the academic field of intellectual property through the identification of invisible colleges," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(3), pages 1561-1574, June.
    4. Jie Yu & Yaliu Li & Chenle Pan & Junwei Wang, 2021. "A Classification Method for Academic Resources Based on a Graph Attention Network," Future Internet, MDPI, vol. 13(3), pages 1-16, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sjögårde, Peter & Ahlgren, Per, 2018. "Granularity of algorithmically constructed publication-level classifications of research publications: Identification of topics," Journal of Informetrics, Elsevier, vol. 12(1), pages 133-152.
    2. Yun, Jinhyuk & Ahn, Sejung & Lee, June Young, 2020. "Return to basics: Clustering of scientific literature using structural information," Journal of Informetrics, Elsevier, vol. 14(4).
    3. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    4. Yun, Jinhyuk, 2022. "Generalization of bibliographic coupling and co-citation using the node split network," Journal of Informetrics, Elsevier, vol. 16(2).
    5. Shu, Fei & Julien, Charles-Antoine & Zhang, Lin & Qiu, Junping & Zhang, Jing & Larivière, Vincent, 2019. "Comparing journal and paper level classifications of science," Journal of Informetrics, Elsevier, vol. 13(1), pages 202-225.
    6. Carlos Olmeda-Gómez & Carlos Romá-Mateo & Maria-Antonia Ovalle-Perandones, 2019. "Overview of trends in global epigenetic research (2009–2017)," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1545-1574, June.
    7. Wolfgang Glänzel & Bart Thijs, 2017. "Using hybrid methods and ‘core documents’ for the representation of clusters and topics: the astronomy dataset," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(2), pages 1071-1087, May.
    8. Bart Thijs & Edgar Schiebel & Wolfgang Glänzel, 2013. "Do second-order similarities provide added-value in a hybrid approach?," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 667-677, September.
    9. Serhat Burmaoglu & Ozcan Saritas, 2019. "An evolutionary analysis of the innovation policy domain: Is there a paradigm shift?," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 823-847, March.
    10. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    11. Shuo Xu & Liyuan Hao & Xin An & Hongshen Pang & Ting Li, 2020. "Review on emerging research topics with key-route main path analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 607-624, January.
    12. Paul Donner, 2021. "Validation of the Astro dataset clustering solutions with external data," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1619-1645, February.
    13. Liu, Yunmei & Yang, Liu & Chen, Min, 2021. "A new citation concept: Triangular citation in the literature," Journal of Informetrics, Elsevier, vol. 15(2).
    14. Li, Menghui & Yang, Liying & Zhang, Huina & Shen, Zhesi & Wu, Chensheng & Wu, Jinshan, 2017. "Do mathematicians, economists and biomedical scientists trace large topics more strongly than physicists?," Journal of Informetrics, Elsevier, vol. 11(2), pages 598-607.
    15. Michel Zitt, 2015. "Meso-level retrieval: IR-bibliometrics interplay and hybrid citation-words methods in scientific fields delineation," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2223-2245, March.
    16. Matthias Held & Grit Laudel & Jochen Gläser, 2021. "Challenges to the validity of topic reconstruction," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 4511-4536, May.
    17. Dejian Yu & Wanru Wang & Shuai Zhang & Wenyu Zhang & Rongyu Liu, 2017. "Hybrid self-optimized clustering model based on citation links and textual features to detect research topics," PLOS ONE, Public Library of Science, vol. 12(10), pages 1-21, October.
    18. Cristian Colliander & Per Ahlgren, 2012. "Experimental comparison of first and second-order similarities in a scientometric context," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(2), pages 675-685, February.
    19. Baccini, Federica & Barabesi, Lucio & Baccini, Alberto & Khelfaoui, Mahdi & Gingras, Yves, 2022. "Similarity network fusion for scholarly journals," Journal of Informetrics, Elsevier, vol. 16(1).
    20. Mu-Hsuan Huang & Chia-Pin Chang, 2014. "Detecting research fronts in OLED field using bibliographic coupling with sliding window," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(3), pages 1721-1744, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:111:y:2017:i:3:d:10.1007_s11192-017-2366-2. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.