IDEAS home Printed from https://ideas.repec.org/

The Herrero-Villar approach to citation impact

Listed author(s):
  • Albarrán, Pedro
  • Herrero, Carmen
  • Ruiz-Castillo, Javier
  • Villar, Antonio

This paper focuses on the evaluation of research institutions in terms of size-independent indicators. Well-known procedures exist in this context, such as what we call additive rules, which evaluate the impact of any research unit in a scientific field based on a partition of the field's citations into ordered categories, together with an external weighting system for those categories. We introduce here a new ranking procedure that is not an additive rule – the HV procedure, after Herrero & Villar (2013) – and compare it with those conventional evaluation rules within a common setting. Given a set of ordered categories, the HV procedure measures the performance of the different research units in terms of the relative probability of getting more citations. The HV method also provides a complete, transitive, and cardinal evaluation without resorting to any external weighting scheme. Using a large dataset of publications in 22 scientific fields assigned to 40 countries, we compare the performance of several additive rules – the Relative Citation Rate, four percentile-based ranking procedures, and two average-based high-impact indicators – and the corresponding HV procedures under the same set of ordered categories. Comparisons take into account re-rankings and differences in outcome variability, measured by the coefficient of variation, the range, and the ratio between the maximum and minimum index values.
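The abstract's description of the HV procedure – ranking units by the relative probability of getting more citations, given a partition into ordered categories – can be sketched as a pairwise "probability of superiority" computation. The sketch below is an illustrative reconstruction, not the authors' exact formula; the three-category example counts and the even splitting of ties are assumptions for demonstration only:

```python
# Illustrative sketch (assumed form, not the HV procedure as published):
# given two research units' paper counts across ordered citation categories
# (lowest impact first), estimate the probability that a randomly drawn
# paper from unit A falls in a strictly higher category than a randomly
# drawn paper from unit B, with ties split evenly between the two units.

def prob_more_citations(a, b):
    """a, b: lists of paper counts per ordered category, lowest first."""
    total_pairs = sum(a) * sum(b)
    wins = ties = 0
    for i, na in enumerate(a):
        for j, nb in enumerate(b):
            pairs = na * nb          # number of (paper from A, paper from B) pairs
            if i > j:
                wins += pairs        # A's paper sits in a higher category
            elif i == j:
                ties += pairs        # same category: split evenly
    return (wins + 0.5 * ties) / total_pairs

# Hypothetical example with three ordered categories (low, medium, high):
unit_a = [10, 30, 60]   # most papers in the top category
unit_b = [50, 30, 20]   # most papers in the bottom category
p = prob_more_citations(unit_a, unit_b)   # 0.76: unit A dominates
```

Note that the two directions are complementary, `prob_more_citations(a, b) + prob_more_citations(b, a) == 1`, which is what makes a complete, transitive, cardinal comparison possible without any external category weights.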


File URL: http://www.sciencedirect.com/science/article/pii/S1751157716303716
Download Restriction: Full text for ScienceDirect subscribers only

As access to this document is restricted, you may want to look for a different version under "Related research" (further below) or search for a different version of it.

Article provided by Elsevier in its journal Journal of Informetrics.

Volume (Year): 11 (2017)
Issue: 2
Pages: 625-640


Handle: RePEc:eee:infome:v:11:y:2017:i:2:p:625-640
DOI: 10.1016/j.joi.2017.04.008
Contact details of provider: Web page: http://www.elsevier.com/locate/joi

References listed on IDEAS



  1. Caroline S. Wagner & Loet Leydesdorff, 2012. "An Integrated Impact Indicator: A new definition of 'Impact' with policy relevance," Research Evaluation, Oxford University Press, vol. 21(3), pages 183-188, July.
  2. László Kóczy & Alexandru Nichifor, 2013. "The intellectual influence of economic journals: quality versus quantity," Economic Theory, Springer;Society for the Advancement of Economic Theory (SAET), vol. 52(3), pages 863-884, April.
  3. Fairclough, Ruth & Thelwall, Mike, 2015. "More precise methods for national research citation impact comparisons," Journal of Informetrics, Elsevier, vol. 9(4), pages 895-906.
  4. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2016. "A farewell to the MNCS and like size-independent indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 646-651.
  5. Neus Herranz & Javier Ruiz-Castillo, 2012. "Sub-field normalization in the multiplicative case: High- and low-impact citation indicators," Research Evaluation, Oxford University Press, vol. 21(2), pages 113-125, April.
  6. Bornmann, Lutz & Williams, Richard, 2013. "How to calculate the practical significance of citation impact differences? An empirical example from evaluative institutional bibliometrics using adjusted predictions and marginal effects," Journal of Informetrics, Elsevier, vol. 7(2), pages 562-574.
  7. Loet Leydesdorff & Lutz Bornmann & Rüdiger Mutz & Tobias Opthof, 2011. "Turning the tables on citation analysis one more time: Principles for comparing sets of documents," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(7), pages 1370-1381, July.
  8. Albarrán, Pedro & Ortuño, Ignacio & Ruiz-Castillo, Javier, 2011. "High- and low-impact citation measures: Empirical applications," Journal of Informetrics, Elsevier, vol. 5(1), pages 122-145.
  9. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
  10. Nicolas CARAYOL & Agenor LAHATTE, 2014. "Dominance relations and ranking when quantity and quality both matter: Applications to US universities and econ. departments worldwide," Cahiers du GREThA 2014-14, Groupe de Recherche en Economie Théorique et Appliquée.
  11. Yunrong Li & Javier Ruiz-Castillo, 2014. "The impact of extreme observations in citation distributions," Research Evaluation, Oxford University Press, vol. 23(2), pages 174-182.
  12. Foster, James & Greer, Joel & Thorbecke, Erik, 1984. "A Class of Decomposable Poverty Measures," Econometrica, Econometric Society, vol. 52(3), pages 761-766, May.
  13. Ronald Rousseau, 2012. "Basic properties of both percentile rank scores and the I3 indicator," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(2), pages 416-420, February.
  14. Michael Schreiber, 2012. "Inconsistencies of recently proposed citation impact indicators and how to avoid them," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(10), pages 2062-2073, October.
  15. Denis Bouyssou & Thierry Marchant, 2011. "Ranking scientists and departments in a consistent manner," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(9), pages 1761-1769, September.
  16. Ignacio Palacios-Huerta & Oscar Volij, 2004. "The Measurement of Intellectual Influence," Econometrica, Econometric Society, vol. 72(3), pages 963-977, May.
  17. Pedro Albarrán & Ignacio Ortuño & Javier Ruiz-Castillo, 2011. "Average-based versus high- and low-impact indicators for the evaluation of scientific distributions," Research Evaluation, Oxford University Press, vol. 20(4), pages 325-339, October.
  18. Ludo Waltman & Michael Schreiber, 2013. "On the calculation of percentile-based bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 372-379, February.
  19. Giora Slutzki & Oscar Volij, 2006. "Scoring of web pages and tournaments—axiomatizations," Social Choice and Welfare, Springer;The Society for Social Choice and Welfare, vol. 26(1), pages 75-92, January.
  20. Waltman, Ludo & van Eck, Nees Jan, 2015. "Field-normalized citation impact indicators and the choice of an appropriate counting method," Journal of Informetrics, Elsevier, vol. 9(4), pages 872-894.
  21. Albarrán, Pedro & Ortuño, Ignacio & Ruiz-Castillo, Javier, 2011. "The measurement of low- and high-impact in citation distributions: Technical results," Journal of Informetrics, Elsevier, vol. 5(1), pages 48-63.
  22. Antonio Perianes-Rodriguez & Javier Ruiz-Castillo, 2016. "A comparison of two ways of evaluating research units working in different scientific fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 539-561, February.
  23. Federico Echenique & Roland G. Fryer, 2007. "A Measure of Segregation Based on Social Interactions," The Quarterly Journal of Economics, Oxford University Press, vol. 122(2), pages 441-485.
  24. Pavel Yu. Chebotarev & Elena Shamis, 1998. "Characterizations of scoring methods for preference aggregation," Annals of Operations Research, Springer, vol. 80(0), pages 299-332, January.
  25. Pedro Albarrán & Antonio Perianes-Rodríguez & Javier Ruiz-Castillo, 2015. "Differences in citation impact across countries," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(3), pages 512-525, March.
  26. Bouyssou, Denis & Marchant, Thierry, 2014. "An axiomatic approach to bibliometric rankings and indices," Journal of Informetrics, Elsevier, vol. 8(3), pages 449-477.
  27. Laband, David N & Piette, Michael J, 1994. "The Relative Impacts of Economics Journals: 1970-1990," Journal of Economic Literature, American Economic Association, vol. 32(2), pages 640-666, June.
  28. Loet Leydesdorff & Lutz Bornmann, 2012. "Percentile ranks and the integrated impact indicator (I3)," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(9), pages 1901-1902, September.
  29. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
  30. Giora Slutzki & Oscar Volij, 2005. "Ranking participants in generalized tournaments," International Journal of Game Theory, Springer;Game Theory Society, vol. 33(2), pages 255-270, June.
  31. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.



This information is provided to you by IDEAS at the Research Division of the Federal Reserve Bank of St. Louis using RePEc data.