Ranking Multi-Metric Scientific Achievements Using a Concept of Pareto Optimality

Author

Listed:
  • Shahryar Rahnamayan

    (Nature Inspired Computational Intelligence (NICI) Lab, Department of Electrical, Computer, and Software Engineering, Ontario Tech University, Oshawa, ON L1G 0C5, Canada)

  • Sedigheh Mahdavi

    (Nature Inspired Computational Intelligence (NICI) Lab, Department of Electrical, Computer, and Software Engineering, Ontario Tech University, Oshawa, ON L1G 0C5, Canada)

  • Kalyanmoy Deb

    (Department of Electrical and Computer Engineering, Michigan State University (MSU), East Lansing, MI 48824, USA)

  • Azam Asilian Bidgoli

    (Nature Inspired Computational Intelligence (NICI) Lab, Department of Electrical, Computer, and Software Engineering, Ontario Tech University, Oshawa, ON L1G 0C5, Canada)

Abstract

The ranking of multi-metric scientific achievements is a challenging task. For example, the scientific ranking of researchers relies on two major types of indicators, namely the number of publications and citations. Existing approaches focus on how to select proper indicators, considering either a single indicator or a combination of them. The majority of ranking methods combine several indicators, but these methods face a challenging concern: the assignment of suitable/optimal weights to the targeted indicators. Pareto optimality is defined as a measure of efficiency in multi-objective optimization, which seeks optimal solutions by considering multiple criteria/objectives simultaneously. The performance of the basic Pareto dominance depth ranking strategy decreases as the number of criteria increases (generally speaking, beyond three criteria). In this paper, a new, modified Pareto dominance depth ranking strategy is proposed which uses dominance metrics obtained from the basic Pareto dominance depth ranking together with sorted statistical metrics to rank scientific achievements. It attempts to find clusters of the compared data by using all indicators simultaneously. Furthermore, we apply the proposed method to address the multi-source ranking resolution problem, which is very common these days; for example, several world-wide institutions rank the world’s universities every year, but their rankings are not consistent. As case studies, the proposed method was used to rank several scientific datasets (i.e., researchers, universities, and countries) as a proof of concept.
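For readers unfamiliar with the basic Pareto dominance depth ranking (non-dominated sorting) that the paper builds on, the sketch below shows the core idea on a toy example: each record is described by several indicators, and records are grouped into successive Pareto fronts, where a record in front 1 is not dominated by any other record. This is a minimal illustration of the underlying concept only, not the authors' modified strategy; the indicator set (publications, citations, h-index) and the sample values are hypothetical.

```python
# Minimal sketch of Pareto dominance depth ranking (non-dominated sorting)
# applied to multi-metric records. Illustrative only; it is not the paper's
# modified method, and the data below are hypothetical.

def dominates(a, b):
    """a dominates b if a is >= b on every metric and > b on at least one
    (all metrics are treated as 'larger is better')."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_depth_ranking(records):
    """Group records into Pareto fronts; front 1 (lowest depth) is best."""
    remaining = dict(records)  # name -> tuple of indicator values
    fronts = []
    while remaining:
        # A record belongs to the current front if no other remaining record dominates it.
        front = [name for name, metrics in remaining.items()
                 if not any(dominates(other, metrics)
                            for key, other in remaining.items() if key != name)]
        fronts.append(front)
        for name in front:
            del remaining[name]
    return fronts

# Hypothetical researchers described by (publications, citations, h-index).
data = {
    "A": (120, 3400, 30),
    "B": (80, 5200, 35),
    "C": (60, 1500, 20),
    "D": (150, 2800, 28),
}

for depth, front in enumerate(pareto_depth_ranking(data), start=1):
    print(f"Front {depth}: {front}")
```

As the abstract notes, this basic depth ranking discriminates poorly once many criteria are involved (most records end up in the first front), which is what motivates the paper's modified strategy based on additional dominance metrics and sorted statistical metrics.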

Suggested Citation

  • Shahryar Rahnamayan & Sedigheh Mahdavi & Kalyanmoy Deb & Azam Asilian Bidgoli, 2020. "Ranking Multi-Metric Scientific Achievements Using a Concept of Pareto Optimality," Mathematics, MDPI, vol. 8(6), pages 1-46, June.
  • Handle: RePEc:gam:jmathe:v:8:y:2020:i:6:p:956-:d:370058

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/8/6/956/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/8/6/956/
    Download Restriction: no

    References listed on IDEAS

    1. Catherine Dehon & Alice McCathie & Vincenzo Verardi, 2010. "Uncovering excellence in academic rankings: a closer look at the Shanghai ranking," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(2), pages 515-524, May.
    2. Ludo Waltman & Clara Calero‐Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    3. Sidiropoulos, A. & Gogoglou, A. & Katsaros, D. & Manolopoulos, Y., 2016. "Gazing at the skyline for star scientists," Journal of Informetrics, Elsevier, vol. 10(3), pages 789-813.
    4. Murat Perit Çakır & Cengiz Acartürk & Oğuzhan Alaşehir & Canan Çilingir, 2015. "A comparative analysis of global and national university ranking systems," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(3), pages 813-848, June.
    5. Veljko Jeremic & Milica Bulajic & Milan Martic & Zoran Radojicic, 2011. "A fresh approach to evaluating the academic ranking of world universities," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(3), pages 587-596, June.
    6. Pablo D. Batista & Mônica G. Campiteli & Osame Kinouchi, 2006. "Is it possible to compare researchers with different scientific interests?," Scientometrics, Springer;Akadémiai Kiadó, vol. 68(1), pages 179-189, July.
    7. Themis Lazaridis, 2010. "Ranking university departments using the mean h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 82(2), pages 211-216, February.
    8. Altannar Chinchuluun & Panos Pardalos, 2007. "A survey of recent developments in multiobjective optimization," Annals of Operations Research, Springer, vol. 154(1), pages 29-50, October.
    9. Thed N. Van Leeuwen & Martijn S. Visser & Henk F. Moed & Ton J. Nederhof & Anthony F. J. Van Raan, 2003. "The Holy Grail of science policy: Exploring and combining bibliometric tools in search of scientific excellence," Scientometrics, Springer;Akadémiai Kiadó, vol. 57(2), pages 257-280, June.
    10. Anthony F. J. van Raan, 2006. "Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 67(3), pages 491-502, June.
    11. S. Alonso & F. J. Cabrerizo & E. Herrera-Viedma & F. Herrera, 2010. "hg-index: a new index to characterize the scientific output of researchers based on the h- and g-indices," Scientometrics, Springer;Akadémiai Kiadó, vol. 82(2), pages 391-400, February.
    12. Isidro F. Aguillo & Judit Bar-Ilan & Mark Levene & José Luis Ortega, 2010. "Comparing university rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(1), pages 243-256, October.
    13. Leo Egghe, 2006. "Theory and practise of the g-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 131-152, October.
    14. Costas, Rodrigo & Bordons, María, 2007. "The h-index: Advantages, limitations and its relation with other bibliometric indicators at the micro level," Journal of Informetrics, Elsevier, vol. 1(3), pages 193-203.
    15. Liming Liang, 2006. "h-index sequence and h-index matrix: Constructions and applications," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 153-159, October.
    16. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    17. Isidro F. Aguillo & Begoña Granadino & José L. Ortega & José A. Prieto, 2006. "Scientific research activity and communication measured with cybermetrics indicators," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 57(10), pages 1296-1302, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Lorna Wildgaard & Jesper W. Schneider & Birger Larsen, 2014. "A review of the characteristics of 108 author-level bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 125-158, October.
    2. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    3. Franceschini, Fiorenzo & Maisano, Domenico, 2010. "The citation triad: An overview of a scientist's publication output based on Ferrers diagrams," Journal of Informetrics, Elsevier, vol. 4(4), pages 503-511.
    4. Fiorenzo Franceschini & Domenico Maisano, 2011. "Bibliometric positioning of scientific manufacturing journals: a comparative analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 86(2), pages 463-485, February.
    5. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    6. Massucci, Francesco Alessandro & Docampo, Domingo, 2019. "Measuring the academic reputation through citation networks via PageRank," Journal of Informetrics, Elsevier, vol. 13(1), pages 185-201.
    7. Bornmann, Lutz & Mutz, Rüdiger & Hug, Sven E. & Daniel, Hans-Dieter, 2011. "A multilevel meta-analysis of studies reporting correlations between the h index and 37 different h index variants," Journal of Informetrics, Elsevier, vol. 5(3), pages 346-359.
    8. Vieira, E.S. & Gomes, J.A.N.F., 2010. "A research impact indicator for institutions," Journal of Informetrics, Elsevier, vol. 4(4), pages 581-590.
    9. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    10. Zhang, Lin & Thijs, Bart & Glänzel, Wolfgang, 2011. "The diffusion of H-related literature," Journal of Informetrics, Elsevier, vol. 5(4), pages 583-593.
    11. Lorna Wildgaard, 2015. "A comparison of 17 author-level bibliometric indicators for researchers in Astronomy, Environmental Science, Philosophy and Public Health in Web of Science and Google Scholar," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 873-906, September.
    12. Pablo Dorta-González & María-Isabel Dorta-González, 2011. "Central indexes to the citation distribution: a complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(3), pages 729-745, September.
    13. Deming Lin & Tianhui Gong & Wenbin Liu & Martin Meyer, 2020. "An entropy-based measure for the evolution of h index research," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2283-2298, December.
    14. Franceschini, Fiorenzo & Maisano, Domenico, 2011. "Structured evaluation of the scientific output of academic research groups by recent h-based indicators," Journal of Informetrics, Elsevier, vol. 5(1), pages 64-74.
    15. Serge Galam, 2011. "Tailor based allocations for multiple authorship: a fractional gh-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(1), pages 365-379, October.
    16. Parul Khurana & Kiran Sharma, 2022. "Impact of h-index on author’s rankings: an improvement to the h-index for lower-ranked authors," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4483-4498, August.
    17. Fiorenzo Franceschini & Domenico Maisano, 2011. "Criticism on the hg-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 86(2), pages 339-346, February.
    18. Bornmann, Lutz & Marx, Werner, 2012. "HistCite analysis of papers constituting the h index research front," Journal of Informetrics, Elsevier, vol. 6(2), pages 285-288.
    19. Yu Liu & Wei Zuo & Ying Gao & Yanhong Qiao, 2013. "Comprehensive geometrical interpretation of h-type indices," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 605-615, August.
    20. Kuan, Chung-Huei & Huang, Mu-Hsuan & Chen, Dar-Zen, 2011. "Ranking patent assignee performance by h-index and shape descriptors," Journal of Informetrics, Elsevier, vol. 5(2), pages 303-312.
