
Rank analysis of most cited publications, a new approach for research assessments

Author

Listed:
  • Rodríguez-Navarro, Alonso
  • Brito, Ricardo

Abstract

Citation metrics are the best tools for research assessments. However, current metrics may be misleading in research systems that simultaneously pursue different goals, such as pushing the boundaries of knowledge and producing incremental innovations, because the corresponding publications have different citation distributions. We estimate the contribution to the progress of knowledge by studying only a limited number of the most cited papers, which are dominated by publications pursuing this progress. To field-normalize the metrics, we substitute the rank positions of a country's papers in the global list of papers for their numbers of citations. Using synthetic series of lognormally distributed numbers that simulate citations, we developed the Rk-index, which is calculated from the global ranks of the 10 highest numbers in each series, and demonstrate its equivalence to the number of papers in top percentiles, Ptop 0.1 % and Ptop 0.01 %. In real cases, the Rk-index is simple and easy to calculate, and it evaluates the contribution to the progress of knowledge better than less stringent metrics. Although further research is needed, rank analysis of the most cited papers is a promising approach for research evaluation. It is also demonstrated that, for this purpose, domestic and collaborative papers should be analyzed independently.
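The abstract describes the method only in outline. As a rough illustration, the sketch below (Python) pools synthetic lognormal "citation" series from several hypothetical countries, assigns global ranks, and computes an Rk-style score from the global ranks of each country's 10 most cited papers alongside Ptop 0.1 % and Ptop 0.01 % counts. The lognormal parameters, country sizes, and the sum-of-reciprocal-ranks scoring rule are assumptions made here for illustration; the paper gives the exact definition of the Rk-index.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical countries with lognormally distributed citation counts
# (the paper uses synthetic lognormal series to simulate citations).
countries = {
    "A": rng.lognormal(mean=1.2, sigma=1.1, size=20_000),
    "B": rng.lognormal(mean=1.0, sigma=1.1, size=30_000),
    "C": rng.lognormal(mean=0.8, sigma=1.1, size=50_000),
}

# Pool all papers and assign global ranks (1 = most cited).
pooled = np.concatenate(list(countries.values()))
labels = np.concatenate([[name] * len(v) for name, v in countries.items()])
order = np.argsort(-pooled)                  # indices sorted by descending citations
global_rank = np.empty_like(order)
global_rank[order] = np.arange(1, len(pooled) + 1)

def rk_index(ranks, k=10):
    """Illustrative Rk-style score from the global ranks of a country's
    k most cited papers. The sum of reciprocal ranks is an assumption
    made for this sketch, not the paper's exact formula."""
    top = np.sort(ranks)[:k]
    return float(np.sum(1.0 / top))

def p_top(citations_country, pooled, fraction):
    """Number of the country's papers at or above the global top-`fraction`
    citation threshold, e.g. fraction=0.001 for Ptop 0.1 %."""
    threshold = np.quantile(pooled, 1.0 - fraction)
    return int(np.sum(citations_country >= threshold))

for name, cits in countries.items():
    ranks = global_rank[labels == name]
    print(name,
          "Rk:", round(rk_index(ranks), 4),
          "Ptop 0.1%:", p_top(cits, pooled, 0.001),
          "Ptop 0.01%:", p_top(cits, pooled, 0.0001))
```

Under these assumptions, countries whose most cited papers sit higher in the global ranking obtain larger Rk-style scores, and those scores track the Ptop 0.1 % and Ptop 0.01 % counts, which is the equivalence the abstract refers to.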

Suggested Citation

  • Rodríguez-Navarro, Alonso & Brito, Ricardo, 2024. "Rank analysis of most cited publications, a new approach for research assessments," Journal of Informetrics, Elsevier, vol. 18(2).
  • Handle: RePEc:eee:infome:v:18:y:2024:i:2:s1751157724000166
    DOI: 10.1016/j.joi.2024.101503

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157724000166
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2024.101503?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bornmann, Lutz & Mutz, Rüdiger & Hug, Sven E. & Daniel, Hans-Dieter, 2011. "A multilevel meta-analysis of studies reporting correlations between the h index and 37 different h index variants," Journal of Informetrics, Elsevier, vol. 5(3), pages 346-359.
    2. V. A. Traag & L. Waltman, 2019. "Systematic analysis of agreement between metrics and peer review in the UK REF," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-12, December.
    3. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2018. "Double rank analysis for research assessment," Journal of Informetrics, Elsevier, vol. 12(1), pages 31-41.
    4. Alonso Rodríguez-Navarro & Ricardo Brito, 2020. "Like-for-like bibliometric substitutes for peer review: Advantages and limits of indicators calculated from the ep index," Research Evaluation, Oxford University Press, vol. 29(2), pages 215-230.
    5. Neus Herranz & Javier Ruiz-Castillo, 2013. "The end of the “European Paradox”," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(1), pages 453-464, April.
    6. Ruiz-Castillo, Javier & Waltman, Ludo, 2015. "Field-normalized citation impact indicators using algorithmically constructed classification systems of science," Journal of Informetrics, Elsevier, vol. 9(1), pages 102-117.
    7. Alonso Rodríguez-Navarro & Ricardo Brito, 2022. "The link between countries’ economic and scientific wealth has a complex dependence on technological activity and research policy," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2871-2896, May.
    8. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    9. Jesper W. Schneider & Rodrigo Costas, 2017. "Identifying potential “breakthrough” publications using refined citation analyses: Three related explorative approaches," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(3), pages 709-723, March.
    10. Vîiu, Gabriel-Alexandru, 2018. "The lognormal distribution explains the remarkable pattern documented by characteristic scores and scales in scientometrics," Journal of Informetrics, Elsevier, vol. 12(2), pages 401-415.
    11. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    12. Andrea Bonaccorsi, 2007. "Explaining poor performance of European science: Institutions versus policies," Science and Public Policy, Oxford University Press, vol. 34(5), pages 303-316, June.
    13. Waltman, Ludo & van Eck, Nees Jan, 2015. "Field-normalized citation impact indicators and the choice of an appropriate counting method," Journal of Informetrics, Elsevier, vol. 9(4), pages 872-894.
    14. Ludo Waltman & Nees Jan van Eck & Anthony F. J. van Raan, 2012. "Universality of citation distributions revisited," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(1), pages 72-77, January.
    15. Dosi, Giovanni & Llerena, Patrick & Labini, Mauro Sylos, 2006. "The relationships between science, technologies and their industrial exploitation: An illustration through the myths and realities of the so-called `European Paradox'," Research Policy, Elsevier, vol. 35(10), pages 1450-1464, December.
    16. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2019. "Evaluating research and researchers by the journal impact factor: Is it better than coin flipping?," Journal of Informetrics, Elsevier, vol. 13(1), pages 314-324.
    17. van den Besselaar, Peter & Heyman, Ulf & Sandström, Ulf, 2017. "Perverse effects of output-based research funding? Butler’s Australian case revisited," Journal of Informetrics, Elsevier, vol. 11(3), pages 905-918.
    18. Alonso Rodríguez-Navarro & Francis Narin, 2018. "European Paradox or Delusion—Are European Science and Economy Outdated?," Science and Public Policy, Oxford University Press, vol. 45(1), pages 14-23.
    19. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    20. B Ian Hutchins & Xin Yuan & James M Anderson & George M Santangelo, 2016. "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-25, September.
    21. Ulrich Schmoch & Torben Schubert & Dorothea Jansen & Richard Heidler & Regina von Görtz, 2010. "How to use indicators to measure scientific performance: a balanced approach," Research Evaluation, Oxford University Press, vol. 19(1), pages 2-18, March.
    22. Katchanov, Yurij L. & Markova, Yulia V. & Shmatko, Natalia A., 2023. "Uncited papers in the structure of scientific communication," Journal of Informetrics, Elsevier, vol. 17(2).
    23. Alonso Rodríguez-Navarro & Ricardo Brito, 2019. "Probability and expected frequency of breakthroughs: basis and use of a robust method of research assessment," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 213-235, April.
    24. Felix Poege & Dietmar Harhoff & Fabian Gaessler & Stefano Baruffaldi, 2019. "Science Quality and the Value of Inventions," Papers 1903.05020, arXiv.org, revised Apr 2019.
    25. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    26. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Alonso Rodríguez-Navarro & Ricardo Brito, 2019. "Probability and expected frequency of breakthroughs: basis and use of a robust method of research assessment," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 213-235, April.
    2. Brito, Ricardo & Navarro, Alonso Rodríguez, 2021. "The inconsistency of h-index: A mathematical analysis," Journal of Informetrics, Elsevier, vol. 15(1).
    3. Alonso Rodríguez-Navarro & Ricardo Brito, 2022. "The link between countries’ economic and scientific wealth has a complex dependence on technological activity and research policy," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2871-2896, May.
    4. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2018. "Technological research in the EU is less efficient than the world average. EU research policy risks Europeans’ future," Journal of Informetrics, Elsevier, vol. 12(3), pages 718-731.
    5. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    6. Rodríguez-Navarro, Alonso & Brito, Ricardo, 2018. "Double rank analysis for research assessment," Journal of Informetrics, Elsevier, vol. 12(1), pages 31-41.
    7. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    8. Bornmann, Lutz & Marx, Werner, 2018. "Critical rationalism and the search for standard (field-normalized) indicators in bibliometrics," Journal of Informetrics, Elsevier, vol. 12(3), pages 598-604.
    9. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    10. Bonaccorsi, Andrea & Haddawy, Peter & Cicero, Tindaro & Hassan, Saeed-Ul, 2017. "The solitude of stars. An analysis of the distributed excellence model of European universities," Journal of Informetrics, Elsevier, vol. 11(2), pages 435-454.
    11. Ruiz-Castillo, Javier & Costas, Rodrigo, 2018. "Individual and field citation distributions in 29 broad scientific fields," Journal of Informetrics, Elsevier, vol. 12(3), pages 868-892.
    12. Andrea Bonaccorsi & Tindaro Cicero & Peter Haddawy & Saeed-UL Hassan, 2017. "Explaining the transatlantic gap in research excellence," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 217-241, January.
    13. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    14. Alonso Rodríguez-Navarro & Francis Narin, 2018. "European Paradox or Delusion—Are European Science and Economy Outdated?," Science and Public Policy, Oxford University Press, vol. 45(1), pages 14-23.
    15. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    16. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The lack of meaningful boundary differences between journal impact factor quartiles undermines their independent use in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1495-1525, February.
    17. Melika Mosleh & Saeed Roshani & Mario Coccia, 2022. "Scientific laws of research funding to support citations and diffusion of knowledge in life science," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1931-1951, April.
    18. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    19. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    20. Albarrán, Pedro & Herrero, Carmen & Ruiz-Castillo, Javier & Villar, Antonio, 2017. "The Herrero-Villar approach to citation impact," Journal of Informetrics, Elsevier, vol. 11(2), pages 625-640.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:18:y:2024:i:2:s1751157724000166. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.