Ranking scientists and departments in a consistent manner
Abstract: The standard data used when computing bibliometric rankings of scientists are simply their publication/citation records, i.e., so many papers with 0 citations, so many with 1 citation, so many with 2 citations, and so on. The standard data for bibliometric rankings of departments have the same structure. It is therefore tempting (and many authors have given in to the temptation) to use the same method for computing rankings of scientists and rankings of departments. Depending on the method, this can yield quite surprising and unpleasant results: with some methods, it may happen that the "best" department contains the "worst" scientists, and only them. This problem will not occur if the rankings satisfy a property called consistency, recently introduced in the literature. In this paper, we explore the consequences of consistency and characterize two families of consistent rankings.
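To make the consistency property concrete, here is an illustrative sketch (not taken from the paper) using the h-index, a ranking method known to violate consistency. Consistency requires that if scientist X ranks above scientist Y, then X still ranks above Y after the same set of papers is added to both records; the h-index can reverse the ranking:

```python
# Illustrative sketch (not from the paper): the h-index can violate
# the consistency property discussed in the abstract.

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

x = [10, 10]          # two highly cited papers: h = 2
y = [3, 3, 3]         # three modestly cited papers: h = 3, so Y ranks above X
z = [10, 10, 10, 10]  # the same set of papers, added to both records

print(h_index(x), h_index(y))          # 2 3 -> Y ranks above X
print(h_index(x + z), h_index(y + z))  # 6 4 -> X now ranks above Y: reversal
```

Because merging a department's members' records resembles adding papers to a record, such reversals are what allows an inconsistent method to rank a department of "worst" scientists above a department of "best" ones.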
Bibliographic Info: Paper provided by HAL in its series Post-Print with number hal-00606931.
Date of creation: 2011
Publication status: Published in Journal of the American Society for Information Science and Technology, 2011, 62(9), 1761-1769
Note: View the original document on HAL open archive server: http://hal.archives-ouvertes.fr/hal-00606931/en/
Keywords: Bibliometrics; ranking of scientists; ranking of departments
This paper has been announced in the following NEP Reports:
- NEP-ALL-2011-09-22 (All new papers)
- NEP-IPR-2011-09-22 (Intellectual Property Rights)
- NEP-SOG-2011-09-22 (Sociology of Economics)