
How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations

Author

Listed:
  • Lutz Bornmann

    (Administrative Headquarters of the Max Planck Society)

  • Werner Marx

    (Max Planck Institute for Solid State Research)

Abstract

Although bibliometrics has been a separate research field for many years, there is still no uniformity in the way bibliometric analyses are applied to individual researchers. Therefore, this study aims to set out proposals for how to evaluate individual researchers working in the natural and life sciences. The h index, introduced in 2005, condenses a researcher’s productivity and the impact of his or her publications into a single number (h is the number of publications with at least h citations); however, a single number cannot capture the multidimensional complexity of research performance, nor does it allow sound inter-personal comparisons. This study therefore includes recommendations for a set of indicators to be used for evaluating researchers. Our proposals relate to the selection of data on which an evaluation is based, the analysis of the data, and the presentation of the results.
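The abstract defines the h index and points to citation percentiles as the basis of the proposed indicators. As a rough illustration only, and not the authors' actual procedure, the following Python sketch computes both quantities; the citation counts and the reference set are invented values, and definitions of citation percentiles vary in practice.

from bisect import bisect_left

def h_index(citations):
    """h is the largest number such that h publications have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

def citation_percentile(paper_citations, reference_set):
    """Share (in percent) of reference-set papers cited less often than this paper;
    in practice the reference set would contain papers from the same field and year."""
    ref = sorted(reference_set)
    cited_less = bisect_left(ref, paper_citations)  # papers with fewer citations
    return 100.0 * cited_less / len(ref)

# Invented citation counts for one researcher's publications.
papers = [45, 30, 22, 17, 9, 4, 1, 0]
# Invented reference set, e.g. all papers of the same field and publication year.
field_baseline = [0, 1, 1, 2, 3, 5, 8, 12, 20, 40]

print(h_index(papers))                          # 5
print(citation_percentile(17, field_baseline))  # 80.0

Read this way, a paper's impact is judged against a field- and year-specific reference set rather than as a raw citation count, which is what makes comparisons across fields and publication years possible.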

Suggested Citation

  • Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
  • Handle: RePEc:spr:scient:v:98:y:2014:i:1:d:10.1007_s11192-013-1161-y
    DOI: 10.1007/s11192-013-1161-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-013-1161-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-013-1161-y?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source you can access with your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bornmann, Lutz & Marx, Werner & Schier, Hermann & Rahm, Erhard & Thor, Andreas & Daniel, Hans-Dieter, 2009. "Convergent validity of bibliometric Google Scholar data in the field of chemistry—Citation counts for papers that were accepted by Angewandte Chemie International Edition or rejected but published els," Journal of Informetrics, Elsevier, vol. 3(1), pages 27-35.
    2. Bornmann, Lutz & Mutz, Rüdiger & Hug, Sven E. & Daniel, Hans-Dieter, 2011. "A multilevel meta-analysis of studies reporting correlations between the h index and 37 different h index variants," Journal of Informetrics, Elsevier, vol. 5(3), pages 346-359.
    3. Leo Egghe, 2006. "Theory and practise of the g-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 131-152, October.
    4. Loet Leydesdorff & Lutz Bornmann & Rüdiger Mutz & Tobias Opthof, 2011. "Turning the tables on citation analysis one more time: Principles for comparing sets of documents," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(7), pages 1370-1381, July.
    5. Bornmann, Lutz, 2013. "The problem of citation impact assessments for recent publication years in institutional evaluations," Journal of Informetrics, Elsevier, vol. 7(3), pages 722-729.
    6. Giovanni Abramo & Ciriaco Andrea D’Angelo, 2011. "Evaluating research: from informed peer review to bibliometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 499-514, June.
    7. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    8. Andreas Strotmann & Dangzhi Zhao, 2012. "Author name disambiguation: What difference does it make in author-based citation analysis?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(9), pages 1820-1833, September.
    9. Pedro Albarrán & Javier Ruiz‐Castillo, 2011. "References made and citations received by scientific articles," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(1), pages 40-49, January.
    10. John Panaretos & Chrisovaladis Malesios, 2009. "Assessing scientific research performance and impact with single indices," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 635-670, December.
    11. Cassidy R. Sugimoto & Blaise Cronin, 2012. "Biobibliometric profiling: An examination of multifaceted approaches to scholarship," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(3), pages 450-468, March.
    12. Ryan D. Duffy & Alex Jadidian & Gregory D. Webster & Kyle J. Sandell, 2011. "The research productivity of academic psychologists: assessment, trends, and best practice recommendations," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(1), pages 207-227, October.
    13. Bornmann, Lutz & Ozimek, Adam, 2012. "Stata commands for importing bibliometric data and processing author address information," Journal of Informetrics, Elsevier, vol. 6(4), pages 505-512.
    14. Moed, H. F. & Hesselink, F. Th., 1996. "The publication output and impact of academic chemistry research in the Netherlands during the 1980s: bibliometric analyses and policy implications," Research Policy, Elsevier, vol. 25(5), pages 819-836, August.
    15. Ciriaco Andrea D'Angelo & Cristiano Giuffrida & Giovanni Abramo, 2011. "A heuristic approach to author name disambiguation in bibliometrics databases for large-scale research assessments," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(2), pages 257-269, February.
    16. Linda Butler & Martijn S. Visser, 2006. "Extending citation analysis to non-source items," Scientometrics, Springer;Akadémiai Kiadó, vol. 66(2), pages 327-343, February.
    17. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    18. Rickard Danell, 2011. "Can the quality of scientific work be predicted using information on the author's track record?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(1), pages 50-60, January.
    19. Kosmulski, Marek, 2011. "Successful papers: A new idea in evaluation of scientific output," Journal of Informetrics, Elsevier, vol. 5(3), pages 481-485.
    20. Rodrigo Costas & Thed N. van Leeuwen & María Bordons, 2010. "A bibliometric classificatory approach for the study and assessment of research performance at the individual level: The effects of age on productivity and impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(8), pages 1564-1581, August.
    21. Lin Zhang & Wolfgang Glänzel, 2012. "Where demographics meets scientometrics: towards a dynamic career analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(2), pages 617-630, May.
    22. Ludo Waltman & Nees Jan van Eck, 2012. "The inconsistency of the h-index," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(2), pages 406-415, February.
    23. Lutz Bornmann, 2013. "How to analyze percentile citation impact data meaningfully in bibliometrics: The statistical analysis of distributions, percentile rank classes, and top-cited papers," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(3), pages 587-595, March.
    24. Robert J. W. Tijssen & Martijn S. Visser & Thed N. van Leeuwen, 2002. "Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 54(3), pages 381-397, July.
    25. Jian Wang, 2013. "Citation time window choice for research impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 851-872, March.
    26. Fiorenzo Franceschini & Maurizio Galetto & Domenico Maisano & Luca Mastrogiacomo, 2012. "The success-index: an alternative approach to the h-index for evaluating an individual’s research output," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(3), pages 621-641, September.
    27. Dag W. Aksnes, 2003. "A macro study of self-citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 56(2), pages 235-246, February.
    28. Giovanni Abramo & Ciriaco Andrea D'Angelo & Flavia Di Costa, 2010. "Testing the trade-off between productivity and quality in research activities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(1), pages 132-140, January.
    29. Sune Lehmann & Andrew D. Jackson & Benny E. Lautrup, 2008. "A quantitative analysis of indicators of scientific performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 76(2), pages 369-390, August.
    30. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    31. Wolfgang Glänzel & Koenraad Debackere & Bart Thijs & András Schubert, 2006. "A concise review on the role of author self-citations in information science, bibliometrics and science policy," Scientometrics, Springer;Akadémiai Kiadó, vol. 67(2), pages 263-277, May.
    32. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    33. Abramo, Giovanni & Cicero, Tindaro & D’Angelo, Ciriaco Andrea, 2011. "Assessing the varying level of impact measurement accuracy as a function of the citation window length," Journal of Informetrics, Elsevier, vol. 5(4), pages 659-667.
    34. Félix Moya-Anegón & Vicente P. Guerrero-Bote & Lutz Bornmann & Henk F. Moed, 2013. "The research guarantors of scientific papers and the output counting: a promising new approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 421-434, November.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    2. Lutz Bornmann & Robin Haunschild, 2018. "Plots for visualizing paper impact and journal impact of single researchers in a single graph," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 385-394, April.
    3. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    4. Vîiu, Gabriel-Alexandru, 2017. "Disaggregated research evaluation through median-based characteristic scores and scales: a comparison with the mean-based approach," Journal of Informetrics, Elsevier, vol. 11(3), pages 748-765.
    5. Robin Haunschild & Lutz Bornmann & Jonathan Adams, 2019. "R package for producing beamplots as a preferred alternative to the h index when assessing single researchers (based on downloads from Web of Science)," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 925-927, August.
    6. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.
    7. Bornmann, Lutz & Williams, Richard, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
    8. Péter Vinkler, 2019. "Core journals and elite subsets in scientometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 241-259, October.
    9. Mutz, Rüdiger & Daniel, Hans-Dieter, 2018. "The bibliometric quotient (BQ), or how to measure a researcher’s performance capacity: A Bayesian Poisson Rasch model," Journal of Informetrics, Elsevier, vol. 12(4), pages 1282-1295.
    10. Lutz Bornmann & Loet Leydesdorff, 2018. "Count highly-cited papers instead of papers with h citations: use normalized citation counts and compare “like with like”!," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1119-1123, May.
    11. Gerson Pech & Catarina Delgado, 2020. "Percentile and stochastic-based approach to the comparison of the number of citations of articles indexed in different bibliographic databases," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 223-252, April.
    12. Alberto Martín-Martín & Enrique Orduna-Malea & Emilio Delgado López-Cózar, 2018. "Coverage of highly-cited documents in Google Scholar, Web of Science, and Scopus: a multidisciplinary comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2175-2188, September.
    13. David A. Pendlebury, 2019. "Charting a path between the simple and the false and the complex and unusable: Review of Henk F. Moed, Applied Evaluative Informetrics [in the series Qualitative and Quantitative Analysis of Scientifi," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 549-560, April.
    14. Gabriel-Alexandru Vȋiu & Mihai Păunescu, 2021. "The lack of meaningful boundary differences between journal impact factor quartiles undermines their independent use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1495-1525, February.
    15. Ruben Miranda & Esther Garcia-Carpintero, 2019. "Comparison of the share of documents and citations from different quartile journals in 25 research areas," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 479-501, October.
    16. Tahamtan, Iman & Bornmann, Lutz, 2018. "Core elements in the process of citing publications: Conceptual overview of the literature," Journal of Informetrics, Elsevier, vol. 12(1), pages 203-216.
    17. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2019. "Evaluating research and researchers by the journal impact factor: Is it better than coin flipping?," Journal of Informetrics, Elsevier, vol. 13(1), pages 314-324.
    18. Loet Leydesdorff & Lutz Bornmann & Tobias Opthof, 2019. "hα: the scientist as chimpanzee or bonobo," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1163-1166, March.
    19. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Jianhua Hou & Xiucai Yang & Chaomei Chen, 2018. "Emerging trends and new developments in information science: a document co-citation analysis (2009–2016)," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 869-892, May.
    3. Lorna Wildgaard & Jesper W. Schneider & Birger Larsen, 2014. "A review of the characteristics of 108 author-level bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 125-158, October.
    4. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    5. Loet Leydesdorff & Lutz Bornmann & Tobias Opthof, 2019. "hα: the scientist as chimpanzee or bonobo," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1163-1166, March.
    6. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.
    7. Marcel Clermont & Johanna Krolak & Dirk Tunger, 2021. "Does the citation period have any effect on the informative value of selected citation indicators in research evaluations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1019-1047, February.
    8. Guoliang Lyu & Ganwei Shi, 2019. "On an approach to boosting a journal’s citation potential," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(3), pages 1387-1409, September.
    9. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2018. "Technological research in the EU is less efficient than the world average. EU research policy risks Europeans’ future," Journal of Informetrics, Elsevier, vol. 12(3), pages 718-731.
    10. Maziar Montazerian & Edgar Dutra Zanotto & Hellmut Eckert, 2019. "A new parameter for (normalized) evaluation of H-index: countries as a case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1065-1078, March.
    11. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    12. Bouyssou, Denis & Marchant, Thierry, 2014. "An axiomatic approach to bibliometric rankings and indices," Journal of Informetrics, Elsevier, vol. 8(3), pages 449-477.
    13. Brito, Ricardo & Navarro, Alonso Rodríguez, 2021. "The inconsistency of h-index: A mathematical analysis," Journal of Informetrics, Elsevier, vol. 15(1).
    14. Brandão, Luana Carneiro & Soares de Mello, João Carlos Correia Baptista, 2019. "A multi-criteria approach to the h-index," European Journal of Operational Research, Elsevier, vol. 276(1), pages 357-363.
    15. Ana Paula dos Santos Rubem & Ariane Lima Moura & João Carlos Correia Baptista Soares de Mello, 2015. "Comparative analysis of some individual bibliometric indices when applied to groups of researchers," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 1019-1035, January.
    16. Andersen, Jens Peter, 2017. "An empirical and theoretical critique of the Euclidean index," Journal of Informetrics, Elsevier, vol. 11(2), pages 455-465.
    17. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    18. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    19. Lutz Bornmann & Loet Leydesdorff, 2018. "Count highly-cited papers instead of papers with h citations: use normalized citation counts and compare “like with like”!," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 1119-1123, May.
    20. Antonio Abatemarco & Roberto Dell’Anno, 2013. "Certainty equivalent citation: generalized classes of citation indexes," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(1), pages 263-271, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:98:y:2014:i:1:d:10.1007_s11192-013-1161-y. See general information about how to correct material in RePEc.


    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.