Ranking by Relevance and Citation Counts, a Comparative Study: Google Scholar, Microsoft Academic, WoS and Scopus

Authors

  • Cristòfol Rovira (Department of Communication, Universitat Pompeu Fabra, 08002 Barcelona, Spain)
  • Lluís Codina (Department of Communication, Universitat Pompeu Fabra, 08002 Barcelona, Spain)
  • Frederic Guerrero-Solé (Department of Communication, Universitat Pompeu Fabra, 08002 Barcelona, Spain)
  • Carlos Lopezosa (Department of Communication, Universitat Pompeu Fabra, 08002 Barcelona, Spain)

Abstract

Search engine optimization (SEO) constitutes the set of methods designed to increase the visibility of, and the number of visits to, a web page by means of its ranking on the search engine results pages. Recently, SEO has also been applied to academic databases and search engines, a trend that continues to grow. This new approach, known as academic SEO (ASEO), has generated a field of study with considerable future growth potential due to the impact of open science. The study reported here forms part of this new field of analysis. The ranking of results is a key aspect of any information system since it determines the way in which these results are presented to the user. The aim of this study is to analyze and compare the relevance ranking algorithms employed by various academic platforms to identify the importance of citations received in their algorithms. Specifically, we analyze two search engines and two bibliographic databases: Google Scholar and Microsoft Academic, on the one hand, and Web of Science and Scopus, on the other. A reverse engineering methodology is employed, based on the statistical analysis of Spearman’s correlation coefficients. The results indicate that the ranking algorithms used by Google Scholar and Microsoft Academic are the two most heavily influenced by citations received. Indeed, citation counts are clearly the main SEO factor in these academic search engines. An unexpected finding is that, at certain points in time, Web of Science (WoS) used citations received as a key ranking factor, despite the fact that WoS support documents claim this factor does not intervene.
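
As a rough illustration of the reverse-engineering approach described in the abstract, the sketch below correlates result position with citation counts for a single hypothetical query using Spearman's coefficient. The figures are illustrative assumptions; the study's actual query sets, sample sizes, and per-platform procedures are not reproduced here.

```python
# Minimal sketch of a Spearman rank-correlation check between result position
# and citation counts. All data below are hypothetical, for illustration only.
from scipy.stats import spearmanr

# Positions 1..N as returned by an academic search engine for one query,
# paired with the citation counts displayed for each result.
result_positions = list(range(1, 11))                      # rank 1 = top result
citation_counts = [830, 412, 390, 160, 210, 90, 75, 40, 55, 3]

# A strongly negative rho means highly cited items tend to occupy the top
# positions, i.e. citations received behave as a major ranking (ASEO) factor.
rho, p_value = spearmanr(result_positions, citation_counts)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```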

Suggested Citation

  • Cristòfol Rovira & Lluís Codina & Frederic Guerrero-Solé & Carlos Lopezosa, 2019. "Ranking by Relevance and Citation Counts, a Comparative Study: Google Scholar, Microsoft Academic, WoS and Scopus," Future Internet, MDPI, vol. 11(9), pages 1-21, September.
  • Handle: RePEc:gam:jftint:v:11:y:2019:i:9:p:202-:d:268805

    Download full text from publisher

    File URL: https://www.mdpi.com/1999-5903/11/9/202/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/1999-5903/11/9/202/
    Download Restriction: no

    References listed on IDEAS

    1. Mike Thelwall, 2018. "Does Microsoft Academic find early citations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 325-334, January.
    2. Farhadi, Hadi & Salehi, Hadi & Md Yunus, Melor & Arezoo, Aghaei Chadegani & Farhadi, Maryam & Fooladi, Masood & Ale Ebrahim, Nader, 2012. "Does it Matter Which Citation Tool is Used to Compare the H-Index of a Group of Highly Cited Researchers?," MPRA Paper 47414, University Library of Munich, Germany, revised Dec 2012.
    3. Anne-Wil Harzing, 2013. "A preliminary test of Google Scholar as a source for citation data: a longitudinal study of Nobel prize winners," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 1057-1075, March.
    4. Christos Ziakis & Maro Vlachopoulou & Theodosios Kyrkoudis & Makrina Karagkiozidou, 2019. "Important Factors for Improving Google Search Rank," Future Internet, MDPI, vol. 11(2), pages 1-12, January.
    5. Anne-Wil Harzing & Satu Alakangas, 2017. "Microsoft Academic: is the phoenix getting wings?," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 371-383, January.
    6. Joost C. F. de Winter & Amir A. Zadpoor & Dimitra Dodou, 2014. "The expansion of Google Scholar versus Web of Science: a longitudinal study," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(2), pages 1547-1565, February.
    7. Isidro F. Aguillo, 2012. "Is Google Scholar useful for bibliometrics? A webometric analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(2), pages 343-351, May.
    8. Anne-Wil Harzing, 2014. "A longitudinal study of Google Scholar coverage between 2012 and 2013," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 565-575, January.
    9. Sven E. Hug & Michael Ochsner & Martin P. Brändle, 2017. "Citation analysis with Microsoft Academic," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 371-378, April.
    10. Emilio Delgado López-Cózar & Nicolás Robinson-García & Daniel Torres-Salinas, 2014. "The Google scholar experiment: How to index false papers and manipulate bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(3), pages 446-454, March.
    11. Sven E. Hug & Martin P. Brändle, 2017. "The coverage of Microsoft Academic: analyzing the publication output of a university," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1551-1571, December.
    12. Hamid R. Jamali & Majid Nabavi, 2015. "Open access and sources of full-text articles in Google Scholar in different subject fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 105(3), pages 1635-1651, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Andreas Veglis & Dimitrios Giomelakis, 2019. "Search Engine Optimization," Future Internet, MDPI, vol. 12(1), pages 1-2, December.
    2. Sandro Serpa & Maria José Sá & Ana Isabel Santos & Carlos Miguel Ferreira, 2020. "Challenges for the Academic Editor in the Scientific Publication," Academic Journal of Interdisciplinary Studies, Richtmann Publishing Ltd, vol. 9, May.
    3. Tamás Stadler & Ágoston Temesi & Zoltán Lakner, 2022. "Soil Chemical Pollution and Military Actions: A Bibliometric Analysis," Sustainability, MDPI, vol. 14(12), pages 1-17, June.
    4. Cristòfol Rovira & Lluís Codina & Carlos Lopezosa, 2021. "Language Bias in the Google Scholar Ranking Algorithm," Future Internet, MDPI, vol. 13(2), pages 1-17, January.
    5. Goran Matošević & Jasminka Dobša & Dunja Mladenić, 2021. "Using Machine Learning for Web Page Classification in Search Engine Optimization," Future Internet, MDPI, vol. 13(1), pages 1-20, January.
    6. Andreas Giannakoulopoulos & Nikos Konstantinou & Dimitris Koutsompolis & Minas Pergantis & Iraklis Varlamis, 2019. "Academic Excellence, Website Quality, SEO Performance: Is there a Correlation?," Future Internet, MDPI, vol. 11(11), pages 1-25, November.
    7. Laura Icela González-Pérez & María Soledad Ramírez-Montoya & Francisco José García-Peñalvo, 2021. "Improving Institutional Repositories through User-Centered Design: Indicators from a Focus Group," Future Internet, MDPI, vol. 13(11), pages 1-19, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Cristòfol Rovira & Lluís Codina & Carlos Lopezosa, 2021. "Language Bias in the Google Scholar Ranking Algorithm," Future Internet, MDPI, vol. 13(2), pages 1-17, January.
    2. Moed, Henk F. & Bar-Ilan, Judit & Halevi, Gali, 2016. "A new methodology for comparing Google Scholar and Scopus," Journal of Informetrics, Elsevier, vol. 10(2), pages 533-551.
    3. Zhentao Liang & Jin Mao & Kun Lu & Gang Li, 2021. "Finding citations for PubMed: a large-scale comparison between five freely available bibliographic data sources," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9519-9542, December.
    4. Martin-Martin, Alberto & Orduna-Malea, Enrique & Harzing, Anne-Wil & Delgado López-Cózar, Emilio, 2017. "Can we use Google Scholar to identify highly-cited documents?," Journal of Informetrics, Elsevier, vol. 11(1), pages 152-163.
    5. Thelwall, Mike, 2018. "Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 1-9.
    6. Sergio Copiello, 2019. "The open access citation premium may depend on the openness and inclusiveness of the indexing database, but the relationship is controversial because it is ambiguous where the open access boundary lies," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(2), pages 995-1018, November.
    7. Kousha, Kayvan & Thelwall, Mike & Abdoli, Mahshid, 2018. "Can Microsoft Academic assess the early citation impact of in-press articles? A multi-discipline exploratory analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 287-298.
    8. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    9. Michael Thelwall, 2018. "Can Microsoft Academic be used for citation analysis of preprint archives? The case of the Social Science Research Network," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 913-928, May.
    10. Michael Gusenbauer, 2019. "Google Scholar to overshadow them all? Comparing the sizes of 12 academic search engines and bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 177-214, January.
    11. Enrique Orduna-Malea & Selenay Aytac & Clara Y. Tran, 2019. "Universities through the eyes of bibliographic databases: a retroactive growth comparison of Google Scholar, Scopus and Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 433-450, October.
    12. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    13. Anne-Wil Harzing, 2019. "Two new kids on the block: How do Crossref and Dimensions compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science?," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 341-349, July.
    14. Mike Thelwall, 2018. "Does Microsoft Academic find early citations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 325-334, January.
    15. Vivek Kumar Singh & Satya Swarup Srichandan & Hiran H. Lathabai, 2022. "ResearchGate and Google Scholar: how much do they differ in publications, citations and different metrics and why?," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(3), pages 1515-1542, March.
    16. Hamid R. Jamali & Majid Nabavi, 2015. "Open access and sources of full-text articles in Google Scholar in different subject fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 105(3), pages 1635-1651, December.
    17. Enrique Orduna-Malea & Juan M. Ayllón & Alberto Martín-Martín & Emilio Delgado López-Cózar, 2015. "Methods for estimating the size of Google Scholar," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 931-949, September.
    18. Robin Haunschild & Sven E. Hug & Martin P. Brändle & Lutz Bornmann, 2018. "The number of linked references of publications in Microsoft Academic in comparison with the Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 367-370, January.
    19. Kousha, Kayvan & Thelwall, Mike, 2018. "Can Microsoft Academic help to assess the citation impact of academic books?," Journal of Informetrics, Elsevier, vol. 12(3), pages 972-984.
    20. Anne-Wil Harzing & Satu Alakangas, 2016. "Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 787-804, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jftint:v:11:y:2019:i:9:p:202-:d:268805. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.