Printed from https://ideas.repec.org/a/spr/scient/v109y2016i3d10.1007_s11192-016-2118-8.html

Can alternative indicators overcome language biases in citation counts? A comparison of Spanish and UK research

Author

Listed:
  • Amalia Mas-Bleda

    (University of Wolverhampton)

  • Mike Thelwall

    (University of Wolverhampton)

Abstract

This study compares Spanish and UK research in eight subject fields using a range of bibliometric and social media indicators. For each field, lists of Spanish and UK journal articles published in the year 2012 and their citation counts were extracted from Scopus. The software Webometric Analyst was then used to extract a range of altmetrics for these articles, including patent citations, online presentation mentions, online course syllabus mentions, Wikipedia mentions and Mendeley reader counts, while Altmetric.com was used to extract Twitter mentions. Results show that Mendeley is the altmetric source with the highest coverage, with 80 % of sampled articles having one or more Mendeley readers, followed by Twitter (34 %). The coverage of the remaining sources was lower than 3 %. All of the indicators checked either have too little data or increase the overall difference between Spain and the UK, and so none can be suggested as alternatives to reduce the bias against Spain in traditional citation indexes.
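The coverage figures quoted in the abstract (e.g. 80 % for Mendeley, 34 % for Twitter) are the share of sampled articles with at least one event from a given source. A minimal sketch of that calculation, with made-up counts rather than the study's data:

```python
def coverage(counts):
    """Share of articles with one or more events for a given altmetric source."""
    return sum(1 for c in counts if c > 0) / len(counts)

# Illustrative per-article counts (hypothetical, not from the paper).
mendeley_readers = [12, 0, 3, 7, 1]
tweets = [0, 2, 0, 0, 1]

print(f"Mendeley coverage: {coverage(mendeley_readers):.0%}")  # 80%
print(f"Twitter coverage: {coverage(tweets):.0%}")             # 40%
```

The measure deliberately ignores the size of each count: an article with one reader and one with a hundred both contribute equally to coverage, which is why the paper reports coverage separately from the indicator values themselves.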

Suggested Citation

  • Amalia Mas-Bleda & Mike Thelwall, 2016. "Can alternative indicators overcome language biases in citation counts? A comparison of Spanish and UK research," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2007-2030, December.
  • Handle: RePEc:spr:scient:v:109:y:2016:i:3:d:10.1007_s11192-016-2118-8
    DOI: 10.1007/s11192-016-2118-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-016-2118-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Henk F. Moed & Gali Halevi, 2015. "Multidimensional assessment of scholarly research impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 1988-2002, October.
    2. Fairclough, Ruth & Thelwall, Mike, 2015. "National research impact indicators from Mendeley readers," Journal of Informetrics, Elsevier, vol. 9(4), pages 845-859.
    3. Loet Leydesdorff & Félix de Moya-Anegón & Vicente P. Guerrero-Bote, 2010. "Journal maps on the basis of Scopus data: A comparison with the Journal Citation Reports of the ISI," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(2), pages 352-369, February.
    4. Mike Thelwall & Nabeil Maflahi, 2015. "Are scholarly articles disproportionately read in their own country? An analysis of Mendeley readers," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(6), pages 1124-1135, June.
    5. Brendan Luyt & Daniel Tan, 2010. "Improving Wikipedia's credibility: References and citations in a sample of history articles," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(4), pages 715-722, April.
    6. Ehsan Mohammadi & Mike Thelwall & Kayvan Kousha, 2016. "Can Mendeley bookmarks reflect readership? A survey of user motivations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(5), pages 1198-1209, May.
    7. Fairclough, Ruth & Thelwall, Mike, 2015. "More precise methods for national research citation impact comparisons," Journal of Informetrics, Elsevier, vol. 9(4), pages 895-906.
    8. Kayvan Kousha & Mike Thelwall, 2006. "Motivations for URL citations to open access library and information science articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 68(3), pages 501-517, September.
    9. Mike Thelwall & Paul Wilson, 2016. "Mendeley readership altmetrics for medical articles: An analysis of 45 fields," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(8), pages 1962-1972, August.
    10. Ehsan Mohammadi & Mike Thelwall, 2014. "Mendeley readership altmetrics for the social sciences and humanities: Research evaluation and knowledge flows," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(8), pages 1627-1638, August.
    11. Thelwall, Mike & Sud, Pardeep, 2012. "Webometric research with the Bing Search API 2.0," Journal of Informetrics, Elsevier, vol. 6(1), pages 44-52.
    12. Ortega, José Luis, 2015. "Relationship between altmetric and bibliometric indicators across academic social sites: The case of CSIC's members," Journal of Informetrics, Elsevier, vol. 9(1), pages 39-49.
    13. Wei Jeng & Daqing He & Jiepu Jiang, 2015. "User participation in an academic social networking service: A survey of open group users on Mendeley," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(5), pages 890-904, May.
    14. Hamid R. Jamali & David Nicholas & Eti Herman, 2016. "Scholarly reputation in the digital age and the role of emerging platforms and mechanisms," Research Evaluation, Oxford University Press, vol. 25(1), pages 37-49.
    15. Stefanie Haustein & Isabella Peters & Cassidy R. Sugimoto & Mike Thelwall & Vincent Larivière, 2014. "Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(4), pages 656-669, April.
    16. Ehsan Mohammadi & Mike Thelwall & Stefanie Haustein & Vincent Larivière, 2015. "Who reads research articles? An altmetrics analysis of Mendeley user categories," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(9), pages 1832-1846, September.
    17. Rodrigo Costas & Zohreh Zahedi & Paul Wouters, 2015. "Do “altmetrics” correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(10), pages 2003-2019, October.
    18. Bornmann, Lutz, 2014. "Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics," Journal of Informetrics, Elsevier, vol. 8(4), pages 895-903.
    19. Narin, Francis & Hamilton, Kimberly S. & Olivastro, Dominic, 1997. "The increasing linkage between U.S. technology and public science," Research Policy, Elsevier, vol. 26(3), pages 317-330, October.
    20. Antal van den Bosch & Toine Bogers & Maurice de Kunder, 2016. "Estimating search engine index size variability: a 9-year longitudinal study," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(2), pages 839-856, May.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Bornmann, Lutz & Haunschild, Robin, 2018. "Normalization of zero-inflated data: An empirical analysis of a new indicator family and its use with altmetrics data," Journal of Informetrics, Elsevier, vol. 12(3), pages 998-1011.
    2. Sven E. Hug & Martin P. Brändle, 2017. "The coverage of Microsoft Academic: analyzing the publication output of a university," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1551-1571, December.
    3. Bornmann, Lutz & Haunschild, Robin & Adams, Jonathan, 2019. "Do altmetrics assess societal impact in a comparable way to case studies? An empirical test of the convergent validity of altmetrics based on data from the UK research excellence framework (REF)," Journal of Informetrics, Elsevier, vol. 13(1), pages 325-340.
    4. Lutz Bornmann & Robin Haunschild, 2018. "Allegation of scientific misconduct increases Twitter attention," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1097-1100, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:109:y:2016:i:3:d:10.1007_s11192-016-2118-8. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.