Printed from https://ideas.repec.org/a/eee/infome/v11y2017i3p748-765.html

Disaggregated research evaluation through median-based characteristic scores and scales: a comparison with the mean-based approach

Author

Listed:
  • Vîiu, Gabriel-Alexandru

Abstract

Characteristic scores and scales (CSS) were proposed in the late 1980s as a powerful tool in evaluative scientometrics but have only recently begun to be used for systematic, multi-level appraisal. By relying on successive sample means found in citation distributions, the CSS method yields performance classes that can be used to benchmark individual units of assessment. This article investigates the theoretical and empirical consequences of a median-based approach to the construction of CSS. Mean- and median-based CSS algorithms developed in the R language and environment for statistical computing are applied to citation data of papers from journals indexed in four Web of Science categories: Information Science and Library Science, Social Work, Microscopy, and Thermodynamics. Subject-category-level and journal-level comparisons highlight the specificities of the median-based approach relative to the mean-based CSS. When moving from the latter to the former, substantially fewer papers are ascribed to the poorly cited CSS class and more papers become fairly, remarkably, or outstandingly cited. This transition is also marked by the well-known “Matthew effect” in science. Both CSS versions promote a disaggregated perspective on research evaluation but differ in emphasis: mean-based CSS promote a more exclusive view of excellence, while the median-based approach promotes a more inclusive outlook.
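The CSS construction described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' R code: the function names, the three-step truncation depth, and the "greater than or equal" boundary convention are assumptions made for the sketch. Each step computes a pivot score (the mean, or the median in the paper's variant) of the current citation list, then truncates the list to the papers at or above that pivot and repeats.

```python
from statistics import mean, median

def css_scores(citations, pivot=mean, k=3):
    """Compute k successive CSS pivot scores over a citation distribution.

    At each step the pivot (mean or median) of the remaining papers is
    recorded, and the list is truncated to papers cited at or above it.
    Names and the >= boundary convention are illustrative assumptions.
    """
    scores = []
    subset = sorted(citations)
    for _ in range(k):
        s = pivot(subset)
        scores.append(s)
        subset = [c for c in subset if c >= s]
        if not subset:
            break
    return scores

def classify(c, scores):
    """Map a paper's citation count to a CSS class index:
    0 = poorly, 1 = fairly, 2 = remarkably, 3 = outstandingly cited."""
    return sum(1 for s in scores if c >= s)
```

On a toy list such as [0, 0, 1, 2, 3, 4, 10, 20], the median pivots (2.5, 7.0, 15.0) sit below the mean pivots (5, 15, 20), so fewer papers fall in the poorly cited class under the median variant, matching the pattern the abstract reports.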

Suggested Citation

  • Vîiu, Gabriel-Alexandru, 2017. "Disaggregated research evaluation through median-based characteristic scores and scales: a comparison with the mean-based approach," Journal of Informetrics, Elsevier, vol. 11(3), pages 748-765.
  • Handle: RePEc:eee:infome:v:11:y:2017:i:3:p:748-765
    DOI: 10.1016/j.joi.2017.04.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157717300901
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2017.04.003?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.


    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Alonso Rodríguez-Navarro & Ricardo Brito, 2019. "Probability and expected frequency of breakthroughs: basis and use of a robust method of research assessment," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 213-235, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Vîiu, Gabriel-Alexandru, 2018. "The lognormal distribution explains the remarkable pattern documented by characteristic scores and scales in scientometrics," Journal of Informetrics, Elsevier, vol. 12(2), pages 401-415.
    3. Ruiz-Castillo, Javier & Costas, Rodrigo, 2014. "The skewness of scientific productivity," Journal of Informetrics, Elsevier, vol. 8(4), pages 917-934.
    4. Ruiz-Castillo, Javier & Costas, Rodrigo, 2018. "Individual and field citation distributions in 29 broad scientific fields," Journal of Informetrics, Elsevier, vol. 12(3), pages 868-892.
    5. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    6. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    7. Marcel Clermont & Johanna Krolak & Dirk Tunger, 2021. "Does the citation period have any effect on the informative value of selected citation indicators in research evaluations?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1019-1047, February.
    8. Antonio Perianes-Rodriguez & Javier Ruiz-Castillo, 2016. "University citation distributions," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(11), pages 2790-2804, November.
    9. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    10. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    11. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    12. Bornmann, Lutz & Williams, Richard, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
    13. Giancarlo Ruocco & Cinzia Daraio, 2013. "An empirical approach to compare the performance of heterogeneous academic fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(3), pages 601-625, December.
    14. Georgios Stoupas & Antonis Sidiropoulos & Antonia Gogoglou & Dimitrios Katsaros & Yannis Manolopoulos, 2018. "Rainbow ranking: an adaptable, multidimensional ranking method for publication sets," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 147-160, July.
    15. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    16. Bouyssou, Denis & Marchant, Thierry, 2014. "An axiomatic approach to bibliometric rankings and indices," Journal of Informetrics, Elsevier, vol. 8(3), pages 449-477.
    17. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    18. Zhihui Zhang & Ying Cheng & Nian Cai Liu, 2015. "Improving the normalization effect of mean-based method from the perspective of optimization: optimization-based linear methods and their performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 587-607, January.
    19. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    20. Andrea Bonaccorsi & Tindaro Cicero & Peter Haddawy & Saeed-UL Hassan, 2017. "Explaining the transatlantic gap in research excellence," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 217-241, January.

    Corrections

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/joi .


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.