Printed from https://ideas.repec.org/a/spr/scient/v96y2013i2d10.1007_s11192-012-0938-8.html

Reverse-engineering conference rankings: what does it take to make a reputable conference?

Author

Listed:
  • Peep Küngas

    (University of Tartu)

  • Siim Karus

    (University of Tartu)

  • Svitlana Vakulenko

    (University of Tartu)

  • Marlon Dumas

    (University of Tartu)

  • Cristhian Parra

    (University of Trento)

  • Fabio Casati

    (University of Trento)

Abstract

In recent years, several national and community-driven conference rankings have been compiled. These rankings are often taken as indicators of reputation and used for a variety of purposes, such as evaluating the performance of academic institutions and individual scientists, or selecting target conferences for paper submissions. Current rankings are based on a combination of objective criteria and subjective opinions that are collated and reviewed through largely manual processes. In this setting, the aim of this paper is to shed light on the following question: to what extent do existing conference rankings reflect objective criteria, specifically submission and acceptance statistics and bibliometric indicators? The paper considers three conference rankings in the field of Computer Science: an Australian national ranking, a Brazilian national ranking, and an informal community-built ranking. It is found that in all cases bibliometric indicators are the most important determinants of rank. It is also found that in all rankings, top-tier conferences can be identified with relatively high accuracy through acceptance rates and bibliometric indicators. On the other hand, acceptance rates and bibliometric indicators fail to discriminate between mid-tier and bottom-tier conferences.

Suggested Citation

  • Peep Küngas & Siim Karus & Svitlana Vakulenko & Marlon Dumas & Cristhian Parra & Fabio Casati, 2013. "Reverse-engineering conference rankings: what does it take to make a reputable conference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 651-665, August.
  • Handle: RePEc:spr:scient:v:96:y:2013:i:2:d:10.1007_s11192-012-0938-8
    DOI: 10.1007/s11192-012-0938-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-012-0938-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-012-0938-8?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Michael Eckmann & Anderson Rocha & Jacques Wainer, 2012. "Relationship between high-quality journals and conferences in computer vision," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(2), pages 617-630, February.
    2. Waister Silva Martins & Marcos André Gonçalves & Alberto H. F. Laender & Nivio Ziviani, 2010. "Assessing the quality of scientific conferences based on bibliographic citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(1), pages 133-155, April.
    3. Chen, P. & Xie, H. & Maslov, S. & Redner, S., 2007. "Finding scientific gems with Google’s PageRank algorithm," Journal of Informetrics, Elsevier, vol. 1(1), pages 8-15.
    4. Leo Egghe, 2006. "Theory and practise of the g-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 69(1), pages 131-152, October.
    5. Vanclay, Jerome K., 2011. "An evaluation of the Australian Research Council's journal ranking," Journal of Informetrics, Elsevier, vol. 5(2), pages 265-274.
    6. Pablo Jensen & Jean-Baptiste Rouquier & Yves Croissant, 2009. "Testing bibliometric indicators by their prediction of scientists promotions," Scientometrics, Springer;Akadémiai Kiadó, vol. 78(3), pages 467-479, March.
    7. Sherif Sakr & Mohammad Alomari, 2012. "A decade of database conferences: a look inside the program committees," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(1), pages 173-184, April.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Xiancheng Li & Wenge Rong & Haoran Shi & Jie Tang & Zhang Xiong, 2018. "The impact of conference ranking systems in computer science: a comparative regression analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 879-907, August.
    2. Loizides, Orestis-Stavros & Koutsakis, Polychronis, 2017. "On evaluating the quality of a computer science/computer engineering conference," Journal of Informetrics, Elsevier, vol. 11(2), pages 541-552.
    3. Vinicius da Silva Almendra & Denis Enăchescu & Cornelia Enăchescu, 2015. "Ranking computer science conferences using self-organizing maps with dynamic node splitting," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 267-283, January.
    4. Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Danielle H. Lee, 2019. "Predictive power of conference-related factors on citation rates of conference papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 281-304, January.
    2. Lin Zhang & Wolfgang Glänzel, 2012. "Where demographics meets scientometrics: towards a dynamic career analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(2), pages 617-630, May.
    3. Fuli Zhang, 2017. "Evaluating journal impact based on weighted citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1155-1169, November.
    4. Johan Bollen & Herbert Van de Sompel & Aric Hagberg & Ryan Chute, 2009. "A Principal Component Analysis of 39 Scientific Impact Measures," PLOS ONE, Public Library of Science, vol. 4(6), pages 1-11, June.
    5. Bai, Xiaomei & Zhang, Fuli & Liu, Jiaying & Xia, Feng, 2023. "Quantifying the impact of scientific collaboration and papers via motif-based heterogeneous networks," Journal of Informetrics, Elsevier, vol. 17(2).
    6. Eleni Fragkiadaki & Georgios Evangelidis, 2016. "Three novel indirect indicators for the assessment of papers and authors based on generations of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 657-694, February.
    7. Dinesh Pradhan & Partha Sarathi Paul & Umesh Maheswari & Subrata Nandi & Tanmoy Chakraborty, 2017. "C^3-index: a PageRank based multi-faceted metric for authors’ performance measurement," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 253-273, January.
    8. Hao Wang & Hua-Wei Shen & Xue-Qi Cheng, 2016. "Scientific credit diffusion: Researcher level or paper level?," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 827-837, November.
    9. Zhou, Yuhao & Wang, Ruijie & Zeng, An & Zhang, Yi-Cheng, 2020. "Identifying prize-winning scientists by a competition-aware ranking," Journal of Informetrics, Elsevier, vol. 14(3).
    10. Xiaorui Jiang & Xiaoping Sun & Hai Zhuge, 2013. "Graph-based algorithms for ranking researchers: not all swans are white!," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 743-759, September.
    11. Zhi Li & Qinke Peng & Che Liu, 2016. "Two citation-based indicators to measure latent referential value of papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1299-1313, September.
    12. Xiaomei Bai & Fuli Zhang & Jinzhou Li & Zhong Xu & Zeeshan Patoli & Ivan Lee, 2021. "Quantifying scientific collaboration impact by exploiting collaboration-citation network," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 7993-8008, September.
    13. Fen Zhao & Yi Zhang & Jianguo Lu & Ofer Shai, 2019. "Measuring academic influence using heterogeneous author-citation networks," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1119-1140, March.
    14. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    15. Daniela Godoy & Alejandro Zunino & Cristian Mateos, 2015. "Publication practices in the Argentinian Computer Science community: a bibliometric perspective," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(2), pages 1795-1814, February.
    16. Marcel Clermont & Alexander Dirksen & Barbara Scheidt & Dirk Tunger, 2017. "Citation metrics as an additional indicator for evaluating research performance? An analysis of their correlations and validity," Business Research, Springer;German Academic Association for Business Research, vol. 10(2), pages 249-279, October.
    17. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2018. "Author ranking evaluation at scale," Journal of Informetrics, Elsevier, vol. 12(3), pages 679-702.
    18. Dunaiski, Marcel & Visser, Willem & Geldenhuys, Jaco, 2016. "Evaluating paper and author ranking algorithms using impact and contribution awards," Journal of Informetrics, Elsevier, vol. 10(2), pages 392-407.
    19. Giovanni Abramo & Ciriaco Andrea D’Angelo & Fulvio Viel, 2013. "The suitability of h and g indexes for measuring the research performance of institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(3), pages 555-570, December.
    20. Upul Senanayake & Mahendra Piraveenan & Albert Zomaya, 2015. "The Pagerank-Index: Going beyond Citation Counts in Quantifying Scientific Impact of Researchers," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-34, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:96:y:2013:i:2:d:10.1007_s11192-012-0938-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.