
Three practical field normalised alternative indicator formulae for research evaluation

Author

Listed:
  • Thelwall, Mike

Abstract

Although altmetrics and other web-based alternative indicators are now commonplace on publishers' websites, they can be difficult for research evaluators to use because of the time or expense of collecting the data, the need to benchmark in order to assess their values, the high proportion of zeros in some alternative indicators, and the time taken to calculate multiple complex indicators. These problems are addressed here by (a) a field normalisation formula, the Mean Normalised Log-transformed Citation Score (MNLCS), that allows simple confidence limits to be calculated and is similar to a proposal of Lundberg; (b) field normalisation formulae for the proportion of cited articles in a set, the Equalised Mean-based Normalised Proportion Cited (EMNPC) and the Mean-based Normalised Proportion Cited (MNPC), to deal with mostly uncited data sets; (c) a sampling strategy to minimise data collection costs; and (d) free unified software to gather the raw data, implement the sampling strategy, and calculate the indicator formulae and confidence limits. The approach is demonstrated (but not fully tested) by comparing the Scopus citations, Mendeley readers and Wikipedia mentions of research funded by Wellcome, NIH, and MRC in three large fields for 2013–2016. Within the results, statistically significant differences in both citation counts and Mendeley reader counts were found even for sets of articles that were less than six months old. Mendeley reader counts were more precise than Scopus citations for the most recent articles, and all three funders could be demonstrated to have an impact in Wikipedia that was significantly above the world average.
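
To make the two simplest of these indicators concrete, the following Python sketch illustrates the general idea described in the abstract, restricted to a single field and year: MNLCS averages ln(1 + citations) for the evaluated group and divides by the corresponding world average, and MNPC divides the group's proportion of cited articles by the world proportion. The function names, the example data, and the normal-approximation confidence limits are illustrative assumptions rather than the paper's exact formulae, which also combine multiple fields, handle sampling of the world set, and include the EMNPC variant (not sketched here).

    import math
    from statistics import mean, stdev

    def mnlcs(group_citations, world_citations):
        # Mean Normalised Log-transformed Citation Score (single-field sketch):
        # the group average of ln(1 + c), divided by the world average of ln(1 + c)
        # for articles from the same field and year.
        group_logs = [math.log(1 + c) for c in group_citations]
        world_logs = [math.log(1 + c) for c in world_citations]
        world_mean = mean(world_logs)
        score = mean(group_logs) / world_mean
        # Approximate 95% confidence limits: ln(1 + c) is far less skewed than raw
        # citation counts, so a normal approximation for the group mean is reasonable.
        # (The paper's limits also reflect uncertainty in the world mean.)
        se = stdev(group_logs) / math.sqrt(len(group_logs)) / world_mean
        return score, (score - 1.96 * se, score + 1.96 * se)

    def mnpc(group_citations, world_citations):
        # Mean-based Normalised Proportion Cited (single-field sketch):
        # the group's share of articles with at least one citation,
        # relative to the world share for the same field and year.
        p_group = sum(c > 0 for c in group_citations) / len(group_citations)
        p_world = sum(c > 0 for c in world_citations) / len(world_citations)
        return p_group / p_world

    # Example with made-up citation counts for one field and year.
    group = [0, 3, 1, 7, 0, 2, 5, 1]
    world = [0, 0, 1, 2, 0, 4, 1, 0, 3, 2, 0, 1]
    print(mnlcs(group, world))
    print(mnpc(group, world))

As with other field normalised indicators, a value of 1 corresponds to the world average for the field and year, so a confidence interval lying entirely above 1 suggests impact significantly above the world average.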

Suggested Citation

  • Thelwall, Mike, 2017. "Three practical field normalised alternative indicator formulae for research evaluation," Journal of Informetrics, Elsevier, vol. 11(1), pages 128-151.
  • Handle: RePEc:eee:infome:v:11:y:2017:i:1:p:128-151
    DOI: 10.1016/j.joi.2016.12.002

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S175115771630205X
    Download Restriction: Full text for ScienceDirect subscribers only

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Adam B. Jaffe, 2002. "Building Programme Evaluation into the Design of Public Research-Support Programmes," Oxford Review of Economic Policy, Oxford University Press, vol. 18(1), pages 22-34, Spring.
    2. Thelwall, Mike, 2016. "The discretised lognormal and hooked power law distributions for complete citation data: Best options for modelling and regression," Journal of Informetrics, Elsevier, vol. 10(2), pages 336-346.
    3. Fairclough, Ruth & Thelwall, Mike, 2015. "More precise methods for national research citation impact comparisons," Journal of Informetrics, Elsevier, vol. 9(4), pages 895-906.
    4. Lutz Bornmann & Robin Haunschild, 2016. "How to normalize Twitter counts? A first attempt based on journals in the Twitter Index," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1405-1422, June.
    5. Li, Jiang & Qiao, Lili & Li, Wenyuze & Jin, Yidan, 2014. "Chinese-language articles are not biased in citations: Evidences from Chinese-English bilingual journals in Scopus and Web of Science," Journal of Informetrics, Elsevier, vol. 8(4), pages 912-916.
    6. Mike Thelwall & Paul Wilson, 2016. "Mendeley readership altmetrics for medical articles: An analysis of 45 fields," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(8), pages 1962-1972, August.
    7. Lundberg, Jonas, 2007. "Lifting the crown—citation z-score," Journal of Informetrics, Elsevier, vol. 1(2), pages 145-154.
    8. M.H. MacRoberts & B.R. MacRoberts, 2010. "Problems of citation analysis: A study of uncited and seldom-cited influences," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(1), pages 1-12, January.
    9. Adam B. Jaffe & Manuel Trajtenberg & Rebecca Henderson, 1993. "Geographic Localization of Knowledge Spillovers as Evidenced by Patent Citations," The Quarterly Journal of Economics, Oxford University Press, vol. 108(3), pages 577-598.
    10. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S. & van Raan, Anthony F.J., 2011. "Towards a new crown indicator: Some theoretical considerations," Journal of Informetrics, Elsevier, vol. 5(1), pages 37-47.
    11. Opthof, Tobias & Leydesdorff, Loet, 2010. "Caveats for the journal and field normalizations in the CWTS (“Leiden”) evaluations of research performance," Journal of Informetrics, Elsevier, vol. 4(3), pages 423-430.
    12. Fairclough, Ruth & Thelwall, Mike, 2015. "National research impact indicators from Mendeley readers," Journal of Informetrics, Elsevier, vol. 9(4), pages 845-859.
    13. Ludo Waltman & Michael Schreiber, 2013. "On the calculation of percentile-based bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 372-379, February.
    14. van Raan, Anthony F.J. & van Leeuwen, Thed N. & Visser, Martijn S. & van Eck, Nees Jan & Waltman, Ludo, 2010. "Rivals for the crown: Reply to Opthof and Leydesdorff," Journal of Informetrics, Elsevier, vol. 4(3), pages 431-435.
    15. Bornmann, Lutz & Leydesdorff, Loet & Mutz, Rüdiger, 2013. "The use of percentiles and percentile rank classes in the analysis of bibliometric data: Opportunities and limits," Journal of Informetrics, Elsevier, vol. 7(1), pages 158-165.
    16. Thelwall, Mike, 2016. "Citation count distributions for large monodisciplinary journals," Journal of Informetrics, Elsevier, vol. 10(3), pages 863-874.
    17. Williams, Richard & Bornmann, Lutz, 2016. "Sampling issues in bibliometric analysis," Journal of Informetrics, Elsevier, vol. 10(4), pages 1225-1232.
    18. Thelwall, Mike, 2016. "The precision of the arithmetic mean, geometric mean and percentiles for citation data: An experimental simulation modelling approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 110-123.
    19. Thelwall, Mike, 2016. "Are there too many uncited articles? Zero inflated variants of the discretised lognormal and hooked power law distributions," Journal of Informetrics, Elsevier, vol. 10(2), pages 622-633.
    20. Thelwall, Mike & Sud, Pardeep, 2016. "National, disciplinary and temporal variations in the extent to which articles with more authors have more impact: Evidence from a geometric field normalised citation indicator," Journal of Informetrics, Elsevier, vol. 10(1), pages 48-61.
    21. Ehsan Mohammadi & Mike Thelwall & Stefanie Haustein & Vincent Larivière, 2015. "Who reads research articles? An altmetrics analysis of Mendeley user categories," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(9), pages 1832-1846, September.
    22. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    23. Thelwall, Mike, 2016. "Are the discretised lognormal and hooked power law distributions plausible for citation data?," Journal of Informetrics, Elsevier, vol. 10(2), pages 454-470.
    24. Wallace, Matthew L. & Larivière, Vincent & Gingras, Yves, 2009. "Modeling a century of citation distributions," Journal of Informetrics, Elsevier, vol. 3(4), pages 296-303.
    25. Ehsan Mohammadi & Mike Thelwall & Kayvan Kousha, 2016. "Can Mendeley bookmarks reflect readership? A survey of user motivations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(5), pages 1198-1209, May.
    26. Bornmann, Lutz & Haunschild, Robin, 2016. "Normalization of Mendeley reader impact on the reader- and paper-side: A comparison of the mean discipline normalized reader score (MDNRS) with the mean normalized reader score (MNRS) and bare reader counts," Journal of Informetrics, Elsevier, vol. 10(3), pages 776-788.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Robin Haunschild & Lutz Bornmann, 2018. "Field- and time-normalization of data with many zeros: an empirical analysis using citation and Twitter data," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 997-1012, August.
    2. Lutz Bornmann & Klaus Wohlrabe, 2017. "Normalization of Citation Impact in Economics," CESifo Working Paper Series 6592, CESifo Group Munich.
    3. Mike Thelwall & Kayvan Kousha, 2017. "ResearchGate versus Google Scholar: Which finds more early citations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(2), pages 1125-1131, August.
    4. Thelwall, Mike, 2018. "Microsoft Academic automatic document searches: Accuracy for journal articles and suitability for citation analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 1-9.
    5. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    6. Mike Thelwall, 2018. "Differences between journals and years in the proportions of students, researchers and faculty registering Mendeley articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 717-729, May.
    7. Mike Thelwall & Tamara Nevill, 2019. "No evidence of citation bias as a determinant of STEM gender disparities in US biochemistry, genetics and molecular biology research," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(3), pages 1793-1801, December.
    8. Kousha, Kayvan & Thelwall, Mike & Abdoli, Mahshid, 2018. "Can Microsoft Academic assess the early citation impact of in-press articles? A multi-discipline exploratory analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 287-298.
    9. Bornmann, Lutz & Haunschild, Robin, 2018. "Normalization of zero-inflated data: An empirical analysis of a new indicator family and its use with altmetrics data," Journal of Informetrics, Elsevier, vol. 12(3), pages 998-1011.
    10. Mike Thelwall, 2019. "The influence of highly cited papers on field normalised indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 519-537, February.
    11. Mike Thelwall, 2018. "Does Microsoft Academic find early citations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 325-334, January.
    12. Thelwall, Mike, 2018. "Do females create higher impact research? Scopus citations and Mendeley readers for articles from five countries," Journal of Informetrics, Elsevier, vol. 12(4), pages 1031-1041.
    13. Mike Thelwall & Kayvan Kousha & Mahshid Abdoli, 2017. "Is medical research informing professional practice more highly cited? Evidence from AHFS DI Essentials in drugs.com," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(1), pages 509-527, July.
    14. Thelwall, Mike & Fairclough, Ruth, 2017. "The accuracy of confidence intervals for field normalised indicators," Journal of Informetrics, Elsevier, vol. 11(2), pages 530-540.
    15. Alberto Martín-Martín & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2018. "Coverage of highly-cited documents in Google Scholar, Web of Science, and Scopus: a multidisciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2175-2188, September.
    16. Mike Thelwall, 2017. "Are Mendeley reader counts useful impact indicators in all fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1721-1731, December.
