
Normalization of Mendeley reader counts for impact assessment

Author

Listed:
  • Haunschild, Robin
  • Bornmann, Lutz

Abstract

A different number of citations can be expected for publications appearing in different subject categories and publication years. For this reason, the citation-based normalized indicator Mean Normalized Citation Score (MNCS) is used in bibliometrics. Mendeley is one of the most important sources of altmetrics data, and Mendeley reader counts reflect the impact of publications in terms of readership. Since a significant influence of publication year and discipline has also been observed for Mendeley reader counts, reader impact should not be estimated without normalization. In this study, all articles and reviews of the Web of Science core collection with a publication year of 2012 (and a DOI) are used to normalize their Mendeley reader counts. A new indicator that measures normalized reader impact, the Mean Normalized Reader Score (MNRS), is obtained and compared with the MNCS. The MNRS makes it possible to compare the impact papers have had on Mendeley across subject categories and publication years. Comparisons at the journal and university level show that the MNRS and MNCS correlate more strongly for 9601 journals than for 76 German universities.
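
The MNCS and the proposed MNRS follow the same logic: a paper's observed impact (citation count or Mendeley reader count) is divided by the impact expected for papers from the same subject category and publication year, and these normalized scores are then averaged over the papers of a unit such as a journal or a university. The sketch below illustrates that calculation for reader counts, assuming hypothetical paper records and a single subject category per paper; it is an illustration of the general normalization scheme, not the authors' implementation.

    # Minimal sketch (not the authors' code): field- and year-normalization
    # of Mendeley reader counts, assuming hypothetical paper records with a
    # single subject category each.
    from collections import defaultdict
    from statistics import mean

    papers = [
        {"id": "p1", "category": "Information Science", "year": 2012, "readers": 40},
        {"id": "p2", "category": "Information Science", "year": 2012, "readers": 10},
        {"id": "p3", "category": "Physics", "year": 2012, "readers": 5},
        {"id": "p4", "category": "Physics", "year": 2012, "readers": 15},
    ]

    # Expected readership: mean reader count of all papers in the same
    # subject category and publication year (the reference set).
    by_field_year = defaultdict(list)
    for p in papers:
        by_field_year[(p["category"], p["year"])].append(p["readers"])
    expected = {key: mean(counts) for key, counts in by_field_year.items()}

    def normalized_reader_score(paper):
        # Observed reader count divided by the expected reader count.
        return paper["readers"] / expected[(paper["category"], paper["year"])]

    def mnrs(unit_papers):
        # MNRS of a unit (e.g. a journal or a university): mean of the
        # normalized reader scores of its papers.
        return mean(normalized_reader_score(p) for p in unit_papers)

    # Example: a "journal" consisting of the two Information Science papers
    # has MNRS = (40/25 + 10/25) / 2 = 1.0, i.e. exactly average reader impact.
    print(mnrs(papers[:2]))

A unit with an MNRS above 1 has attracted more Mendeley readers than expected for its fields and publication years; a value below 1 indicates below-average reader impact.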

Suggested Citation

  • Haunschild, Robin & Bornmann, Lutz, 2016. "Normalization of Mendeley reader counts for impact assessment," Journal of Informetrics, Elsevier, vol. 10(1), pages 62-73.
  • Handle: RePEc:eee:infome:v:10:y:2016:i:1:p:62-73
    DOI: 10.1016/j.joi.2015.11.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S175115771520051X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2015.11.003?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Neus Herranz & Javier Ruiz-Castillo, 2012. "Multiplicative and fractional strategies when journals are assigned to several subfields," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(11), pages 2195-2205, November.
    2. Fairclough, Ruth & Thelwall, Mike, 2015. "National research impact indicators from Mendeley readers," Journal of Informetrics, Elsevier, vol. 9(4), pages 845-859.
    3. Bornmann, Lutz & Leydesdorff, Loet & Mutz, Rüdiger, 2013. "The use of percentiles and percentile rank classes in the analysis of bibliometric data: Opportunities and limits," Journal of Informetrics, Elsevier, vol. 7(1), pages 158-165.
    4. Mike Thelwall & Nabeil Maflahi, 2015. "Are scholarly articles disproportionately read in their own country? An analysis of Mendeley readers," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(6), pages 1124-1135, June.
    5. Pedro Albarrán & Juan A. Crespo & Ignacio Ortuño & Javier Ruiz-Castillo, 2011. "The skewness of science in 219 sub-fields and a number of aggregates," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(2), pages 385-397, August.
    6. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S. & van Raan, Anthony F.J., 2011. "Towards a new crown indicator: Some theoretical considerations," Journal of Informetrics, Elsevier, vol. 5(1), pages 37-47.
    7. Per O. Seglen, 1992. "The skewness of science," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 43(9), pages 628-638, October.
    8. Michel Zitt & Henry Small, 2008. "Modifying the journal impact factor by fractional citation weighting: The audience factor," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(11), pages 1856-1860, September.
    9. Jason Priem, 2013. "Beyond the paper," Nature, Nature, vol. 495(7442), pages 437-440, March.
    10. Bornmann, Lutz, 2014. "Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics," Journal of Informetrics, Elsevier, vol. 8(4), pages 895-903.
    11. Adam Dinsmore & Liz Allen & Kevin Dolby, 2014. "Alternative Perspectives on Impact: The Potential of ALMs and Altmetrics to Inform Funders about Research Impact," PLOS Biology, Public Library of Science, vol. 12(11), pages 1-4, November.
    12. Bornmann, Lutz & Marx, Werner, 2015. "Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts?," Journal of Informetrics, Elsevier, vol. 9(2), pages 408-418.
    13. Ludo Waltman & Nees Jan van Eck & Thed N. van Leeuwen & Martijn S. Visser & Anthony F. J. van Raan, 2011. "Towards a new crown indicator: an empirical analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 467-481, June.
    14. Anne-Wil Harzing, 2013. "Document categories in the ISI Web of Knowledge: Misunderstanding the Social Sciences?," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(1), pages 23-34, January.
    15. Waltman, Ludo & van Eck, Nees Jan, 2015. "Field-normalized citation impact indicators and the choice of an appropriate counting method," Journal of Informetrics, Elsevier, vol. 9(4), pages 872-894.
    16. Ehsan Mohammadi & Mike Thelwall, 2014. "Mendeley readership altmetrics for the social sciences and humanities: Research evaluation and knowledge flows," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(8), pages 1627-1638, August.
    17. Wei Jeng & Daqing He & Jiepu Jiang, 2015. "User participation in an academic social networking service: A survey of open group users on Mendeley," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(5), pages 890-904, May.
    18. Björn Hammarfelt, 2014. "Using altmetrics for assessing research impact in the humanities," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1419-1430, November.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Robin Haunschild & Lutz Bornmann, 2018. "Field- and time-normalization of data with many zeros: an empirical analysis using citation and Twitter data," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 997-1012, August.
    2. Bornmann, Lutz & Haunschild, Robin, 2016. "Normalization of Mendeley reader impact on the reader- and paper-side: A comparison of the mean discipline normalized reader score (MDNRS) with the mean normalized reader score (MNRS) and bare reader counts," Journal of Informetrics, Elsevier, vol. 10(3), pages 776-788.
    3. Liwei Zhang & Jue Wang, 2021. "What affects publications’ popularity on Twitter?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(11), pages 9185-9198, November.
    4. Manika Lamba, 2020. "Research productivity of health care policy faculty: a cohort study of Harvard Medical School," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(1), pages 107-130, July.
    5. Yan, Weiwei & Zhang, Yin, 2018. "Research universities on the ResearchGate social networking site: An examination of institutional differences, research activity level, and social networks formed," Journal of Informetrics, Elsevier, vol. 12(1), pages 385-400.
    6. Lutz Bornmann & Rüdiger Mutz & Robin Haunschild & Felix Moya-Anegon & Mirko Almeida Madeira Clemente & Moritz Stefaner, 2021. "Mapping the impact of papers on various status groups in excellencemapping.net: a new release of the excellence mapping tool based on citation and reader scores," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(11), pages 9305-9331, November.
    7. Lutz Bornmann & Robin Haunschild, 2016. "How to normalize Twitter counts? A first attempt based on journals in the Twitter Index," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1405-1422, June.
    8. Ying Guo & Xiantao Xiao, 2022. "Author-level altmetrics for the evaluation of Chinese scholars," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(2), pages 973-990, February.
    9. Robin Haunschild & Lutz Bornmann, 2017. "How many scientific papers are mentioned in policy-related documents? An empirical investigation using Web of Science and Altmetric data," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1209-1216, March.
    10. Ortega, José Luis, 2018. "The life cycle of altmetric impact: A longitudinal study of six metrics from PlumX," Journal of Informetrics, Elsevier, vol. 12(3), pages 579-589.
    11. Gerson Pech & Catarina Delgado, 2020. "Assessing the publication impact using citation data from both Scopus and WoS databases: an approach validated in 15 research fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 909-924, November.
    12. Houqiang Yu, 2017. "Context of altmetrics data matters: an investigation of count type and user category," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 267-283, April.
    13. Bornmann, Lutz & Haunschild, Robin, 2018. "Normalization of zero-inflated data: An empirical analysis of a new indicator family and its use with altmetrics data," Journal of Informetrics, Elsevier, vol. 12(3), pages 998-1011.
    14. Pech, Gerson & Delgado, Catarina, 2021. "Screening the most highly cited papers in longitudinal bibliometric studies and systematic literature reviews of a research field or journal: Widespread used metrics vs a percentile citation-based approach," Journal of Informetrics, Elsevier, vol. 15(3).
    15. Cao, Xuanyu & Chen, Yan & Ray Liu, K.J., 2016. "A data analytic approach to quantifying scientific impact," Journal of Informetrics, Elsevier, vol. 10(2), pages 471-484.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    3. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    4. Amalia Mas-Bleda & Mike Thelwall, 2016. "Can alternative indicators overcome language biases in citation counts? A comparison of Spanish and UK research," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2007-2030, December.
    5. Mojisola Erdt & Aarthy Nagarajan & Sei-Ching Joanna Sin & Yin-Leng Theng, 2016. "Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1117-1166, November.
    6. Bornmann, Lutz & Haunschild, Robin, 2016. "Normalization of Mendeley reader impact on the reader- and paper-side: A comparison of the mean discipline normalized reader score (MDNRS) with the mean normalized reader score (MNRS) and bare reader counts," Journal of Informetrics, Elsevier, vol. 10(3), pages 776-788.
    7. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    8. Thelwall, Mike, 2017. "Three practical field normalised alternative indicator formulae for research evaluation," Journal of Informetrics, Elsevier, vol. 11(1), pages 128-151.
    9. Herranz, Neus & Ruiz-Castillo, Javier, 2012. "Sub-field normalization in the multiplicative case: Average-based citation indicators," Journal of Informetrics, Elsevier, vol. 6(4), pages 543-556.
    10. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    11. Fairclough, Ruth & Thelwall, Mike, 2015. "National research impact indicators from Mendeley readers," Journal of Informetrics, Elsevier, vol. 9(4), pages 845-859.
    12. Bornmann, Lutz & Leydesdorff, Loet, 2017. "Skewness of citation impact data and covariates of citation distributions: A large-scale empirical analysis based on Web of Science data," Journal of Informetrics, Elsevier, vol. 11(1), pages 164-175.
    13. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    14. Thelwall, Mike, 2018. "Do females create higher impact research? Scopus citations and Mendeley readers for articles from five countries," Journal of Informetrics, Elsevier, vol. 12(4), pages 1031-1041.
    15. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    16. Bornmann, Lutz & Haunschild, Robin & Mutz, Rüdiger, 2020. "Should citations be field-normalized in evaluative bibliometrics? An empirical analysis based on propensity score matching," Journal of Informetrics, Elsevier, vol. 14(4).
    17. Lin Zhang & Gunnar Sivertsen & Huiying Du & Ying Huang & Wolfgang Glänzel, 2021. "Gender differences in the aims and impacts of research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(11), pages 8861-8886, November.
    18. Bornmann, Lutz & Haunschild, Robin, 2018. "Normalization of zero-inflated data: An empirical analysis of a new indicator family and its use with altmetrics data," Journal of Informetrics, Elsevier, vol. 12(3), pages 998-1011.
    19. Zhang, Lin & Sivertsen, Gunnar & Du, Huiying & HUANG, Ying & Glänzel, Wolfgang, 2021. "Gender differences in the aims and impacts of research," SocArXiv 9n347, Center for Open Science.
    20. Thelwall, Mike & Sud, Pardeep, 2016. "National, disciplinary and temporal variations in the extent to which articles with more authors have more impact: Evidence from a geometric field normalised citation indicator," Journal of Informetrics, Elsevier, vol. 10(1), pages 48-61.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:10:y:2016:i:1:p:62-73. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.