Ranking scientists and departments in a consistent manner
The standard data used when computing bibliometric rankings of scientists are just their publication/citation records, i.e., so many papers with 0 citations, so many with 1 citation, so many with 2 citations, etc. The standard data for bibliometric rankings of departments have the same structure. It is therefore tempting (and many authors have given in to the temptation) to use the same method for ranking scientists and for ranking departments. Depending on the method, this can yield quite surprising and unpleasant results: with some methods, the "best" department may contain the "worst" scientists, and only them. This problem cannot occur if the rankings satisfy a property called consistency, recently introduced in the literature. In this paper, we explore the consequences of consistency and characterize two families of consistent rankings.
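To illustrate the kind of reversal the abstract alludes to, the sketch below applies the h-index (a well-known ranking rule that is not consistent, per the literature cited here) both to individual scientists and to departments ranked by their pooled publication records. The records and department compositions are hypothetical, chosen only to exhibit the paradox; they are not taken from the paper.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical individual records: a "good" scientist (h = 2)
# and a "bad" scientist (h = 1).
good = [2, 2]   # two papers with 2 citations each -> h = 2
bad = [3]       # one paper with 3 citations       -> h = 1
assert h_index(good) > h_index(bad)

# Departments ranked by the h-index of their pooled records.
dept_of_good = good * 3  # three "good" scientists -> pooled h = 2
dept_of_bad = bad * 3    # three "bad" scientists  -> pooled h = 3
assert h_index(dept_of_bad) > h_index(dept_of_good)
# The "best" department consists entirely of the "worst" scientists.
```

A consistent ranking rule, by contrast, guarantees that pooling records cannot produce this kind of reversal.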
Date of creation: 2011
Publication status: Published in Journal of the American Society for Information Science and Technology, Association for Information Science and Technology (ASIS&T), 2011, 62(9), pp. 1761-1769. DOI: 10.1002/asi.21544
Note: View the original document on the HAL open archive server: https://hal.archives-ouvertes.fr/hal-00606931
Handle: RePEc:hal:journl:hal-00606931