
On evaluating the quality of a computer science/computer engineering conference

Author

Listed:
  • Loizides, Orestis-Stavros
  • Koutsakis, Polychronis

Abstract

The Peer Reputation (PR) metric was recently proposed in the literature to judge a researcher’s contribution through the quality of the venue in which the researcher’s work is published. PR, proposed by Nelakuditi et al., ties the selectivity of a publication venue to the reputation of the first author’s institution. By computing PR for a percentage of the papers accepted at a conference or in a journal, a more solid indicator of a venue’s selectivity than the paper Acceptance Ratio (AR) can be derived. In recent work we explained why we agree that PR offers substantial information that is missing from AR; however, we also pointed out several limitations of the metric. These limitations make PR inadequate, if used on its own, to give a solid evaluation of a researcher’s contribution. In this work, we present our own approach for judging the quality of a Computer Science/Computer Engineering conference venue and thus, implicitly, the potential quality of a paper accepted at that conference. Driven by our previous findings on the adequacy of PR, as well as our belief that an institution does not necessarily “make” a researcher, we propose a Conference Classification Approach (CCA) that takes into account a number of metrics and factors in addition to PR: the paper’s impact and the authors’ h-indexes. We present and discuss our results, based on data gathered from close to 3000 papers from 12 top-tier Computer Science/Computer Engineering conferences belonging to different research fields. To evaluate CCA, we compare our conference rankings against multiple publicly available rankings based on evaluations from the Computer Science/Computer Engineering community, and we show that our approach achieves a very comparable classification.
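The abstract names the ingredients of CCA (PR, paper impact, and the authors’ h-indexes) but not how they are combined. The sketch below is a minimal illustration of one plausible combination, not the scheme used in the paper: min-max normalize each metric across a set of conferences, rank venues by a weighted sum, and bucket the scores into classes. The weights, the `classify` cutoffs, the class labels, and all data values are our own assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class ConferenceStats:
    name: str
    peer_reputation: float   # PR: share of papers whose first author is at a highly ranked institution
    mean_citations: float    # average citations per paper (a paper-impact proxy)
    mean_h_index: float      # average h-index of the papers' authors

def _normalize(values):
    """Min-max normalize raw metric values to [0, 1] across the conference set."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def cca_scores(confs, weights=(0.4, 0.3, 0.3)):
    """Combine the three normalized metrics into one score per conference.

    The weights are a placeholder assumption; the paper does not publish them here.
    """
    pr = _normalize([c.peer_reputation for c in confs])
    ci = _normalize([c.mean_citations for c in confs])
    hx = _normalize([c.mean_h_index for c in confs])
    w_pr, w_ci, w_hx = weights
    return {c.name: w_pr * p + w_ci * i + w_hx * h
            for c, p, i, h in zip(confs, pr, ci, hx)}

def classify(score):
    """Map a combined score to a class label (hypothetical cutoffs)."""
    return "A" if score >= 0.66 else "B" if score >= 0.33 else "C"

if __name__ == "__main__":
    data = [  # made-up example values
        ConferenceStats("CONF-X", 0.55, 42.0, 28.0),
        ConferenceStats("CONF-Y", 0.30, 18.0, 15.0),
        ConferenceStats("CONF-Z", 0.12, 6.0, 9.0),
    ]
    for name, score in sorted(cca_scores(data).items(), key=lambda kv: -kv[1]):
        print(f"{name}: score={score:.2f}, class={classify(score)}")
```

Comparing the resulting ranking against publicly available community rankings, for example with a rank-correlation measure, mirrors the evaluation the abstract describes.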

Suggested Citation

  • Loizides, Orestis-Stavros & Koutsakis, Polychronis, 2017. "On evaluating the quality of a computer science/computer engineering conference," Journal of Informetrics, Elsevier, vol. 11(2), pages 541-552.
  • Handle: RePEc:eee:infome:v:11:y:2017:i:2:p:541-552
    DOI: 10.1016/j.joi.2017.03.008

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157716301808
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2017.03.008?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Abramo, Giovanni & D’Angelo, Ciriaco Andrea, 2016. "A farewell to the MNCS and like size-independent indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 646-651.
    2. Lokman I. Meho & Kiduk Yang, 2007. "Impact of data sources on citation counts and rankings of LIS faculty: Web of Science versus Scopus and Google Scholar," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(13), pages 2105-2125, November.
    3. Vinkler, Péter, 2012. "The case of scientometricians with the “absolute relative” impact indicator," Journal of Informetrics, Elsevier, vol. 6(2), pages 254-264.
    4. James Hartley, 2012. "To cite or not to cite: author self-citations and the impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 313-317, August.
    5. Wolfgang Glänzel & András Schubert, 2003. "A new classification scheme of science fields and subfields designed for scientometric evaluation purposes," Scientometrics, Springer;Akadémiai Kiadó, vol. 56(3), pages 357-367, March.
    6. Peep Küngas & Siim Karus & Svitlana Vakulenko & Marlon Dumas & Cristhian Parra & Fabio Casati, 2013. "Reverse-engineering conference rankings: what does it take to make a reputable conference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 651-665, August.
    7. Anne-Wil Harzing, 2014. "A longitudinal study of Google Scholar coverage between 2012 and 2013," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 565-575, January.
    8. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Danielle H. Lee, 2019. "Predictive power of conference-related factors on citation rates of conference papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 281-304, January.
    2. Xiancheng Li & Wenge Rong & Haoran Shi & Jie Tang & Zhang Xiong, 2018. "The impact of conference ranking systems in computer science: a comparative regression analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 879-907, August.
    3. Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    3. Muhammad Salman & Mohammad Masroor Ahmed & Muhammad Tanvir Afzal, 2021. "Assessment of author ranking indices based on multi-authorship," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 4153-4172, May.
    4. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    5. Parul Khurana & Kiran Sharma, 2022. "Impact of h-index on author’s rankings: an improvement to the h-index for lower-ranked authors," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4483-4498, August.
    6. Albarrán, Pedro & Herrero, Carmen & Ruiz-Castillo, Javier & Villar, Antonio, 2017. "The Herrero-Villar approach to citation impact," Journal of Informetrics, Elsevier, vol. 11(2), pages 625-640.
    7. Alireza Abbasi & Mahdi Jalili & Abolghasem Sadeghi-Niaraki, 2018. "Influence of network-based structural and power diversity on research performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 117(1), pages 579-590, October.
    8. Martin-Martin, Alberto & Orduna-Malea, Enrique & Harzing, Anne-Wil & Delgado López-Cózar, Emilio, 2017. "Can we use Google Scholar to identify highly-cited documents?," Journal of Informetrics, Elsevier, vol. 11(1), pages 152-163.
    9. Sergio Copiello, 2019. "The open access citation premium may depend on the openness and inclusiveness of the indexing database, but the relationship is controversial because it is ambiguous where the open access boundary lies," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(2), pages 995-1018, November.
    10. Borenstein, Denis & Perlin, Marcelo S. & Imasato, Takeyoshi, 2022. "The Academic Inbreeding Controversy: Analysis and Evidence from Brazil," Journal of Informetrics, Elsevier, vol. 16(2).
    11. Andreas Thor & Lutz Bornmann & Werner Marx & Rüdiger Mutz, 2018. "Identifying single influential publications in a research field: new analysis opportunities of the CRExplorer," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 591-608, July.
    12. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    13. Danielle H. Lee, 2019. "Predictive power of conference-related factors on citation rates of conference papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 281-304, January.
    14. Alfio Ferrara & Silvia Salini, 2012. "Ten challenges in modeling bibliographic data for bibliometric analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(3), pages 765-785, December.
    15. Mike Thelwall, 2019. "The influence of highly cited papers on field normalised indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(2), pages 519-537, February.
    16. Filippo Radicchi & Claudio Castellano, 2013. "Analysis of bibliometric indicators for individual scholars in a large data set," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(3), pages 627-637, December.
    17. Marcello D’Agostino & Valentino Dardanoni & Roberto Ghiselli Ricci, 2017. "How to standardize (if you must)," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 825-843, November.
    18. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, vol. 9(1), February.
    19. Alona Zharova & Wolfgang K. Härdle & Stefan Lessmann, 2017. "Is Scientific Performance a Function of Funds?," SFB 649 Discussion Papers SFB649DP2017-028, Sonderforschungsbereich 649, Humboldt University, Berlin, Germany.
    20. Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:11:y:2017:i:2:p:541-552. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item, and it also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu. General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.