
How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations

Author

Listed:
  • Lutz Bornmann

    (Administrative Headquarters of the Max Planck Society)

  • Werner Marx

    (Max Planck Institute for Solid State Research)

Abstract

Although bibliometrics has been a separate research field for many years, there is still no uniformity in the way bibliometric analyses are applied to individual researchers. Therefore, this study aims to set out proposals for how to evaluate individual researchers working in the natural and life sciences. The h index, introduced in 2005, summarizes a researcher’s productivity and the impact of his or her publications in a single number (h is the number of publications with at least h citations); however, a single number cannot cover the multidimensional complexity of research performance, nor does it allow meaningful comparisons between researchers. Accordingly, this study includes recommendations for a set of indicators to be used when evaluating researchers. Our proposals relate to the selection of the data on which an evaluation is based, the analysis of the data, and the presentation of the results.
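
The h index defined in the abstract, and the percentile-based alternative named in the title, can be made concrete with a short sketch. The Python snippet below is illustrative only and is not code from the paper: the function names, the percentile convention (share of reference-set papers with fewer citations), and the example citation counts are assumptions for demonstration.

```python
# Illustrative sketch only, not code from the paper. The percentile convention
# used here (share of reference-set papers with fewer citations) is one common
# choice among several; all citation counts below are invented.

def h_index(citations):
    """h = the largest number such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

def citation_percentile(cites, reference_set):
    """Percentile of a paper's citation count within a reference set
    (e.g. publications from the same field and publication year)."""
    below = sum(1 for c in reference_set if c < cites)
    return 100.0 * below / len(reference_set)

if __name__ == "__main__":
    researcher = [25, 8, 5, 3, 3, 0]                   # hypothetical citation counts
    field_and_year = [0, 1, 1, 2, 3, 4, 6, 8, 12, 25]  # hypothetical reference set
    print(h_index(researcher))                         # -> 3
    print([citation_percentile(c, field_and_year) for c in researcher])
    # -> [90.0, 70.0, 60.0, 40.0, 40.0, 0.0]
```

The contrast illustrates the paper's point: a single number such as h compresses the whole citation distribution, whereas percentiles computed against an appropriate reference set preserve where each publication stands, and several such indicators can be reported side by side.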

Suggested Citation

  • Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
  • Handle: RePEc:spr:scient:v:98:y:2014:i:1:d:10.1007_s11192-013-1161-y
    DOI: 10.1007/s11192-013-1161-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-013-1161-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-013-1161-y?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Loet Leydesdorff & Lutz Bornmann & Rüdiger Mutz & Tobias Opthof, 2011. "Turning the tables on citation analysis one more time: Principles for comparing sets of documents," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(7), pages 1370-1381, July.
    2. Bornmann, Lutz, 2013. "The problem of citation impact assessments for recent publication years in institutional evaluations," Journal of Informetrics, Elsevier, vol. 7(3), pages 722-729.
    3. Giovanni Abramo & Ciriaco Andrea D’Angelo, 2011. "Evaluating research: from informed peer review to bibliometrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 499-514, June.
    4. Pedro Albarrán & Javier Ruiz‐Castillo, 2011. "References made and citations received by scientific articles," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(1), pages 40-49, January.
    5. John Panaretos & Chrisovaladis Malesios, 2009. "Assessing scientific research performance and impact with single indices," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 635-670, December.
    6. Cassidy R. Sugimoto & Blaise Cronin, 2012. "Biobibliometric profiling: An examination of multifaceted approaches to scholarship," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(3), pages 450-468, March.
    7. Anthony F.J. van Raan, 2008. "Bibliometric statistical properties of the 100 largest European research universities: Prevalent scaling rules in the science system," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(3), pages 461-475, February.
    8. Ciriaco Andrea D'Angelo & Cristiano Giuffrida & Giovanni Abramo, 2011. "A heuristic approach to author name disambiguation in bibliometrics databases for large‐scale research assessments," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(2), pages 257-269, February.
    9. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    10. Rickard Danell, 2011. "Can the quality of scientific work be predicted using information on the author's track record?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(1), pages 50-60, January.
    11. Cassidy R. Sugimoto & Blaise Cronin, 2012. "Biobibliometric profiling: An examination of multifaceted approaches to scholarship," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(3), pages 450-468, March.
    12. Rodrigo Costas & Thed N. van Leeuwen & María Bordons, 2010. "A bibliometric classificatory approach for the study and assessment of research performance at the individual level: The effects of age on productivity and impact," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(8), pages 1564-1581, August.
    13. Blaise Cronin & Lokman I. Meho, 2007. "Timelines of creativity: A study of intellectual innovators in information science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(13), pages 1948-1959, November.
    14. Lutz Bornmann, 2013. "How to analyze percentile citation impact data meaningfully in bibliometrics: The statistical analysis of distributions, percentile rank classes, and top-cited papers," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(3), pages 587-595, March.
    15. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    16. Robert J. W. Tijssen & Martijn S. Visser & Thed N. van Leeuwen, 2002. "Benchmarking international scientific excellence: Are highly cited research papers an appropriate frame of reference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 54(3), pages 381-397, July.
    17. Javier Ruiz-Castillo, 2012. "The evaluation of citation distributions," SERIEs: Journal of the Spanish Economic Association, Springer;Spanish Economic Association, vol. 3(1), pages 291-310, March.
    18. Jian Wang, 2013. "Citation time window choice for research impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 851-872, March.
    19. Fiorenzo Franceschini & Maurizio Galetto & Domenico Maisano & Luca Mastrogiacomo, 2012. "The success-index: an alternative approach to the h-index for evaluating an individual’s research output," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(3), pages 621-641, September.
    20. Giovanni Abramo & Ciriaco Andrea D'Angelo & Flavia Di Costa, 2010. "Testing the trade-off between productivity and quality in research activities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 61(1), pages 132-140, January.
    21. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    22. Lokman I. Meho & Kristina M. Spurgin, 2005. "Ranking the research productivity of library and information science faculty and schools: An evaluation of data sources and research methods," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 56(12), pages 1314-1331, October.
    23. Giovanni Abramo & Ciriaco Andrea D'Angelo & Flavia Di Costa, 2010. "Testing the trade‐off between productivity and quality in research activities," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(1), pages 132-140, January.
    24. Bornmann, Lutz & Marx, Werner & Schier, Hermann & Rahm, Erhard & Thor, Andreas & Daniel, Hans-Dieter, 2009. "Convergent validity of bibliometric Google Scholar data in the field of chemistry—Citation counts for papers that were accepted by Angewandte Chemie International Edition or rejected but published elsewhere," Journal of Informetrics, Elsevier, vol. 3(1), pages 27-35.
    25. Bornmann, Lutz & Mutz, Rüdiger & Hug, Sven E. & Daniel, Hans-Dieter, 2011. "A multilevel meta-analysis of studies reporting correlations between the h index and 37 different h index variants," Journal of Informetrics, Elsevier, vol. 5(3), pages 346-359.
    26. Rodrigo Costas & Thed N. van Leeuwen & María Bordons, 2010. "A bibliometric classificatory approach for the study and assessment of research performance at the individual level: The effects of age on productivity and impact," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 61(8), pages 1564-1581, August.
    27. Leo Egghe, 2006. "Theory and practise of the g-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 131-152, October.
    28. Andreas Strotmann & Dangzhi Zhao, 2012. "Author name disambiguation: What difference does it make in author-based citation analysis?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(9), pages 1820-1833, September.
    29. Rickard Danell, 2011. "Can the quality of scientific work be predicted using information on the author's track record?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(1), pages 50-60, January.
    30. Ryan D. Duffy & Alex Jadidian & Gregory D. Webster & Kyle J. Sandell, 2011. "The research productivity of academic psychologists: assessment, trends, and best practice recommendations," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(1), pages 207-227, October.
    31. Bornmann, Lutz & Ozimek, Adam, 2012. "Stata commands for importing bibliometric data and processing author address information," Journal of Informetrics, Elsevier, vol. 6(4), pages 505-512.
    32. Moed, H. F. & Hesselink, F. Th., 1996. "The publication output and impact of academic chemistry research in the Netherlands during the 1980s: bibliometric analyses and policy implications," Research Policy, Elsevier, vol. 25(5), pages 819-836, August.
    33. Ciriaco Andrea D'Angelo & Cristiano Giuffrida & Giovanni Abramo, 2011. "A heuristic approach to author name disambiguation in bibliometrics databases for large-scale research assessments," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(2), pages 257-269, February.
    34. Linda Butler & Martijn S. Visser, 2006. "Extending citation analysis to non-source items," Scientometrics, Springer;Akadémiai Kiadó, vol. 66(2), pages 327-343, February.
    35. Andreas Strotmann & Dangzhi Zhao, 2012. "Author name disambiguation: What difference does it make in author‐based citation analysis?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(9), pages 1820-1833, September.
    36. Kosmulski, Marek, 2011. "Successful papers: A new idea in evaluation of scientific output," Journal of Informetrics, Elsevier, vol. 5(3), pages 481-485.
    37. Lin Zhang & Wolfgang Glänzel, 2012. "Where demographics meets scientometrics: towards a dynamic career analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(2), pages 617-630, May.
    38. Ludo Waltman & Nees Jan van Eck, 2012. "The inconsistency of the h-index," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(2), pages 406-415, February.
    39. Dag W. Aksnes, 2003. "A macro study of self-citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 56(2), pages 235-246, February.
    40. Sune Lehmann & Andrew D. Jackson & Benny E. Lautrup, 2008. "A quantitative analysis of indicators of scientific performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 76(2), pages 369-390, August.
    41. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    42. Wolfgang Glänzel & Koenraad Debackere & Bart Thijs & András Schubert, 2006. "A concise review on the role of author self-citations in information science, bibliometrics and science policy," Scientometrics, Springer;Akadémiai Kiadó, vol. 67(2), pages 263-277, May.
    43. Abramo, Giovanni & Cicero, Tindaro & D’Angelo, Ciriaco Andrea, 2011. "Assessing the varying level of impact measurement accuracy as a function of the citation window length," Journal of Informetrics, Elsevier, vol. 5(4), pages 659-667.
    44. Félix Moya-Anegón & Vicente P. Guerrero-Bote & Lutz Bornmann & Henk F. Moed, 2013. "The research guarantors of scientific papers and the output counting: a promising new approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 421-434, November.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Jianhua Hou & Xiucai Yang & Chaomei Chen, 2018. "Emerging trends and new developments in information science: a document co-citation analysis (2009–2016)," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 869-892, May.
    3. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    4. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    5. Jonas Lindahl & Cristian Colliander & Rickard Danell, 2020. "Early career performance and its correlation with gender and publication output during doctoral education," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 309-330, January.
    6. Lorna Wildgaard & Jesper W. Schneider & Birger Larsen, 2014. "A review of the characteristics of 108 author-level bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 125-158, October.
    7. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    8. Brito, Ricardo & Navarro, Alonso Rodríguez, 2021. "The inconsistency of h-index: A mathematical analysis," Journal of Informetrics, Elsevier, vol. 15(1).
    9. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2016. "A comparison of university performance scores and ranks by MNCS and FSS," Journal of Informetrics, Elsevier, vol. 10(4), pages 889-901.
    10. Jinseok Kim & Jason Owen-Smith, 2021. "ORCID-linked labeled data for evaluating author name disambiguation at scale," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(3), pages 2057-2083, March.
    11. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    12. Andersen, Jens Peter, 2017. "An empirical and theoretical critique of the Euclidean index," Journal of Informetrics, Elsevier, vol. 11(2), pages 455-465.
    13. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    14. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    15. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    16. Wang, Jian, 2016. "Knowledge creation in collaboration networks: Effects of tie configuration," Research Policy, Elsevier, vol. 45(1), pages 68-80.
    17. Guillaume Cabanac & Gilles Hubert & Béatrice Milard, 2015. "Academic careers in Computer Science: continuance and transience of lifetime co-authorships," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 135-150, January.
    18. Loet Leydesdorff & Lutz Bornmann & Tobias Opthof, 2019. "hα: the scientist as chimpanzee or bonobo," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1163-1166, March.
    19. Vîiu, Gabriel-Alexandru, 2017. "Disaggregated research evaluation through median-based characteristic scores and scales: a comparison with the mean-based approach," Journal of Informetrics, Elsevier, vol. 11(3), pages 748-765.
    20. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.
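
    The "most related items" above are, as described before the list, the items that most often cite the same works as this article and are cited by the same works. A minimal sketch of that overlap idea, with hypothetical identifiers (RePEc's actual implementation is not shown here):

```python
# Illustrative sketch of the relatedness notion described above (not RePEc's
# actual code): an item is related to the target when it cites many of the
# same works (shared references) and is cited by many of the same works
# (shared citers). All identifiers below are hypothetical.

def relatedness(target_refs, target_citers, item_refs, item_citers):
    """Count shared references plus shared citing works."""
    return (len(set(target_refs) & set(item_refs))
            + len(set(target_citers) & set(item_citers)))

# Rank candidate items by their overlap with the target item.
target = ({"refA", "refB", "refC"}, {"citX", "citY"})
candidates = {
    "item1": ({"refA", "refB"}, {"citX"}),
    "item2": ({"refC"}, set()),
}
ranked = sorted(candidates.items(),
                key=lambda kv: relatedness(*target, *kv[1]),
                reverse=True)
print([name for name, _ in ranked])  # -> ['item1', 'item2']
```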

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:98:y:2014:i:1:d:10.1007_s11192-013-1161-y. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.