
How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications

Author

Listed:
  • Zohreh Zahedi (Leiden University)
  • Rodrigo Costas (Leiden University)
  • Paul Wouters (Leiden University)

Abstract

In this paper an analysis of the presence and possibilities of altmetrics for bibliometric and performance analysis is carried out. Using the web-based tool Impact Story, we collected metrics for 20,000 random publications from the Web of Science. We studied both the presence and distribution of altmetrics in the set of publications, across fields, document types and over publication years, as well as the extent to which altmetrics correlate with citation indicators. The main result of the study is that the altmetrics source that provides the most metrics is Mendeley, with readership metrics for 62.6 % of all the publications studied; other sources provide only marginal information. In terms of the relation with citations, a moderate Spearman correlation (r = 0.49) has been found between Mendeley readership counts and citation indicators. Other possibilities and limitations of these indicators are discussed and future research lines are outlined.
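
The abstract describes one concrete analytical step: comparing Mendeley readership counts with citation indicators using a Spearman rank correlation. The sketch below illustrates how such a comparison could be computed; it is not the authors' code, the field names and sample records are hypothetical, and the study itself used Impact Story data for 20,000 Web of Science publications.

```python
# Illustrative sketch (not the authors' code): Spearman correlation between
# Mendeley readership counts and citation counts for a set of publications.
# The record structure and values below are hypothetical.

from scipy.stats import spearmanr

# Hypothetical per-publication records: citation counts and Mendeley
# readership counts gathered from an altmetrics aggregator.
publications = [
    {"doi": "10.1000/example.1", "citations": 12, "mendeley_readers": 30},
    {"doi": "10.1000/example.2", "citations": 0,  "mendeley_readers": 4},
    {"doi": "10.1000/example.3", "citations": 7,  "mendeley_readers": 15},
    {"doi": "10.1000/example.4", "citations": 3,  "mendeley_readers": 0},
]

citations = [p["citations"] for p in publications]
readers = [p["mendeley_readers"] for p in publications]

# A rank-based correlation is used because citation and readership counts
# are typically skewed and heavy-tailed.
rho, p_value = spearmanr(citations, readers)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```

On a sample of the size reported in the paper, such a rank correlation is the natural choice over Pearson's r, since both citation and readership distributions are strongly skewed.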

Suggested Citation

  • Zohreh Zahedi & Rodrigo Costas & Paul Wouters, 2014. "How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1491-1513, November.
  • Handle: RePEc:spr:scient:v:101:y:2014:i:2:d:10.1007_s11192-014-1264-0
    DOI: 10.1007/s11192-014-1264-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-014-1264-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bornmann, Lutz & Leydesdorff, Loet, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.
    2. Haustein, Stefanie & Siebenlist, Tobias, 2011. "Applying social bookmarking data to evaluate journal usage," Journal of Informetrics, Elsevier, vol. 5(3), pages 446-457.
    3. Anton J. Nederhof, 2006. "Bibliometric monitoring of research performance in the Social Sciences and the Humanities: A Review," Scientometrics, Springer;Akadémiai Kiadó, vol. 66(1), pages 81-100, January.
    4. Anthony F. J. van Raan & Thed N. van Leeuwen & Martijn S. Visser, 2011. "Severe language effect in university rankings: particularly Germany and France are wronged in citation-based rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(2), pages 495-498, August.
    5. Henk F Moed, 2007. "The future of research evaluation rests with an intelligent combination of advanced metrics and transparent peer review," Science and Public Policy, Oxford University Press, vol. 34(8), pages 575-583, October.
    6. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    7. Mike Thelwall, 2012. "Journal impact evaluation: a webometric perspective," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 429-441, August.
    8. Xuemei Li & Mike Thelwall & Dean Giustini, 2012. "Validating online reference managers for scholarly impact measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(2), pages 461-471, May.
    9. Maria Bordons & M. T. Fernández & Isabel Gómez, 2002. "Advantages and limitations in the use of impact factor measures for the assessment of research performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 53(2), pages 195-206, February.
    10. Mike Thelwall, 2004. "Weak benchmarking indicators for formative and semi-evaluative assessment of research," Research Evaluation, Oxford University Press, vol. 13(1), pages 63-68, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:101:y:2014:i:2:d:10.1007_s11192-014-1264-0. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.