
Using Scopus’s CiteScore for assessing the quality of computer science conferences

Author

  • Meho, Lokman I.

Abstract

Publication, hiring, promotion, tenure, and funding decisions in computer science often depend on an accurate assessment of the quality of conferences. This study reviews relevant literature and tests Scopus’s CiteScore database and method for evaluating the quality of 395 conferences in the field.
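
For context, CiteScore, the metric the study tests, was defined by Scopus at the time of this article as the number of citations a venue receives in a calendar year to the items it published in the preceding three years, divided by the count of those items (Scopus widened the window to four years in 2020). A minimal sketch of that calculation in Python; the function name and the sample figures are hypothetical, for illustration only:

    # CiteScore as Scopus defined it in 2019: citations received in year Y
    # to items published in years Y-3 through Y-1, divided by the number of
    # items published in that window. All figures here are made up.
    def citescore(citations_in_year: int, docs_in_window: int) -> float:
        if docs_in_window == 0:
            raise ValueError("publication window contains no documents")
        return citations_in_year / docs_in_window

    # Example: 600 conference papers published in 2015-2017 that drew
    # 1,500 citations during 2018 yield a 2018 CiteScore of 2.5.
    print(citescore(1500, 600))  # 2.5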

Suggested Citation

  • Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.
  • Handle: RePEc:eee:infome:v:13:y:2019:i:1:p:419-433
    DOI: 10.1016/j.joi.2019.02.006

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157718304176
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2019.02.006?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can read through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lokman I. Meho & Yvonne Rogers, 2008. "Citation counting, citation ranking, and h‐index of human‐computer interaction researchers: A comparison of Scopus and Web of Science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(11), pages 1711-1726, September.
    2. Vahid Garousi & João M. Fernandes, 2017. "Quantity versus impact of software engineering papers: a quantitative study," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(2), pages 963-1006, August.
    3. George Vrettas & Mark Sanderson, 2015. "Conferences versus journals in computer science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2674-2684, December.
    4. Yifan Qian & Wenge Rong & Nan Jiang & Jie Tang & Zhang Xiong, 2017. "Citation regression analysis of computer science publications in different ranking categories and subfields," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1351-1374, March.
    5. Xiancheng Li & Wenge Rong & Haoran Shi & Jie Tang & Zhang Xiong, 2018. "The impact of conference ranking systems in computer science: a comparative regression analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 879-907, August.
    6. Loizides, Orestis-Stavros & Koutsakis, Polychronis, 2017. "On evaluating the quality of a computer science/computer engineering conference," Journal of Informetrics, Elsevier, vol. 11(2), pages 541-552.
    7. Waister Silva Martins & Marcos André Gonçalves & Alberto H. F. Laender & Nivio Ziviani, 2010. "Assessing the quality of scientific conferences based on bibliographic citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(1), pages 133-155, April.
    8. Lokman I. Meho & Kiduk Yang, 2007. "Impact of data sources on citation counts and rankings of LIS faculty: Web of science versus scopus and google scholar," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(13), pages 2105-2125, November.
    9. Jaime A. Teixeira da Silva & Aamir Raoof Memon, 2017. "CiteScore: A cite for sore eyes, or a valuable, transparent metric?," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 553-556, April.
    10. Peep Küngas & Siim Karus & Svitlana Vakulenko & Marlon Dumas & Cristhian Parra & Fabio Casati, 2013. "Reverse-engineering conference rankings: what does it take to make a reputable conference?," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 651-665, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Marcelo Mendoza, 2021. "Differences in Citation Patterns across Areas, Article Types and Age Groups of Researchers," Publications, MDPI, vol. 9(4), pages 1-23, October.
    2. Pooyan Makvandi & Anahita Nodehi & Franklin R. Tay, 2021. "Conference Accreditation and Need of a Bibliometric Measure to Distinguish Predatory Conferences," Publications, MDPI, vol. 9(2), pages 1-5, April.
    3. Jaime A. Teixeira da Silva, 2021. "CiteScore: risk of copy-cat, fake and misleading metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1859-1862, February.
    4. Sun, Zhuanlan & Liu, Sheng & Li, Yiwei & Ma, Chao, 2023. "Expedited editorial decision in COVID-19 pandemic," Journal of Informetrics, Elsevier, vol. 17(1).
    5. Saarela, Mirka & Kärkkäinen, Tommi, 2020. "Can we automate expert-based journal rankings? Analysis of the Finnish publication indicator," Journal of Informetrics, Elsevier, vol. 14(2).
    6. Diana Purwitasari & Chastine Fatichah & Surya Sumpeno & Christian Steglich & Mauridhi Hery Purnomo, 2020. "Identifying collaboration dynamics of bipartite author-topic networks with the influences of interest changes," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(3), pages 1407-1443, March.
    7. Paolo Lorenzo Ferrara & Michele La Noce & Gaetano Sciuto, 2023. "Sustainability of Green Building Materials: A Scientometric Review of Geopolymers from a Circular Economy Perspective," Sustainability, MDPI, vol. 15(22), pages 1-27, November.
    8. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Danielle H. Lee, 2019. "Predictive power of conference-related factors on citation rates of conference papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 281-304, January.
    2. Xiancheng Li & Wenge Rong & Haoran Shi & Jie Tang & Zhang Xiong, 2018. "The impact of conference ranking systems in computer science: a comparative regression analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 879-907, August.
    3. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    4. Marcelo Mendoza, 2021. "Differences in Citation Patterns across Areas, Article Types and Age Groups of Researchers," Publications, MDPI, vol. 9(4), pages 1-23, October.
    5. García-Pérez, Miguel A., 2011. "Strange attractors in the Web of Science database," Journal of Informetrics, Elsevier, vol. 5(1), pages 214-218.
    6. Loizides, Orestis-Stavros & Koutsakis, Polychronis, 2017. "On evaluating the quality of a computer science/computer engineering conference," Journal of Informetrics, Elsevier, vol. 11(2), pages 541-552.
    7. Omar Mubin & Abdullah Al Mahmud & Muneeb Ahmad, 2017. "HCI down under: reflecting on a decade of the OzCHI conference," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(1), pages 367-382, July.
    8. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    9. García-Pérez, Miguel A., 2012. "An extension of the h index that covers the tail and the top of the citation curve and allows ranking researchers with similar h," Journal of Informetrics, Elsevier, vol. 6(4), pages 689-699.
    10. Vinicius da Silva Almendra & Denis Enăchescu & Cornelia Enăchescu, 2015. "Ranking computer science conferences using self-organizing maps with dynamic node splitting," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(1), pages 267-283, January.
    11. Judit Bar-Ilan, 2010. "Citations to the “Introduction to informetrics” indexed by WOS, Scopus and Google Scholar," Scientometrics, Springer;Akadémiai Kiadó, vol. 82(3), pages 495-506, March.
    12. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    13. Bornmann, Lutz & Marx, Werner & Schier, Hermann & Rahm, Erhard & Thor, Andreas & Daniel, Hans-Dieter, 2009. "Convergent validity of bibliometric Google Scholar data in the field of chemistry—Citation counts for papers that were accepted by Angewandte Chemie International Edition or rejected but published elsewhere," Journal of Informetrics, Elsevier, vol. 3(1), pages 27-35.
    14. Carolin Michels & Jun-Ying Fu, 2014. "Systematic analysis of coverage and usage of conference proceedings in web of science," Scientometrics, Springer;Akadémiai Kiadó, vol. 100(2), pages 307-327, August.
    15. António Correia & Hugo Paredes & Benjamim Fonseca, 2018. "Scientometric analysis of scientific publications in CSCW," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(1), pages 31-89, January.
    16. Omar Mubin & Fady Alnajjar & Abdullah Shamail & Suleman Shahid & Simeon Simoff, 2021. "The new norm: Computer Science conferences respond to COVID-19," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1813-1827, February.
    17. Christoph Bartneck, 2017. "Reviewers’ scores do not predict impact: bibliometric analysis of the proceedings of the human–robot interaction conference," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 179-194, January.
    18. Massimo Franceschet, 2010. "A comparison of bibliometric indicators for computer science scholars and journals on Web of Science and Google Scholar," Scientometrics, Springer;Akadémiai Kiadó, vol. 83(1), pages 243-258, April.
    19. Elizabeth S. Vieira & José A. N. F. Gomes, 2009. "A comparison of Scopus and Web of Science for a typical university," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(2), pages 587-600, November.
    20. Saarela, Mirka & Kärkkäinen, Tommi, 2020. "Can we automate expert-based journal rankings? Analysis of the Finnish publication indicator," Journal of Informetrics, Elsevier, vol. 14(2).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:13:y:2019:i:1:p:419-433. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do so here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.