
Scholarly event characteristics in four fields of science: a metrics-based analysis

Author

Listed:
  • Said Fathalla (University of Bonn; University of Alexandria)
  • Sahar Vahdati (University of Oxford)
  • Christoph Lange (RWTH Aachen University; Fraunhofer FIT)
  • Sören Auer (Leibniz University of Hannover; TIB Leibniz Information Centre for Science and Technology)

Abstract

One of the key channels of scholarly knowledge exchange is scholarly events such as conferences, workshops, and symposia; such events are especially important and popular in Computer Science, Engineering, and the Natural Sciences. However, scholars encounter problems in finding relevant information about upcoming events and statistics on their historical evolution. To obtain a better understanding of scholarly event characteristics, we analyzed the metadata of scholarly events in four major fields of science, namely Computer Science, Physics, Engineering, and Mathematics, using the Scholarly Events Quality Assessment suite, a suite of ten metrics. In particular, we analyzed renowned scholarly events belonging to five sub-fields within Computer Science, namely World Wide Web, Computer Vision, Software Engineering, Data Management, and Security and Privacy. The analysis follows a systematic approach based on descriptive statistics and exploratory data analysis. The findings are interesting, on the one hand, for observing the general evolution and success factors of scholarly events; on the other hand, they allow (prospective) event organizers, publishers, and committee members to assess the progress of their event over time and compare it to other events in the same field; finally, they help researchers make more informed decisions when selecting suitable venues for presenting their work. Based on these findings, a set of recommendations has been derived for different stakeholders, including event organizers, potential authors, proceedings publishers, and sponsors. Our comprehensive dataset of scholarly events in the aforementioned fields is openly available in a semantic format and maintained collaboratively at OpenResearch.org.
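
To make the metrics-based approach described above more concrete, the sketch below computes two simple indicators (acceptance rate and submission growth) over toy event metadata using pandas. It is a hypothetical illustration only: the data, column names, and indicators are assumptions made here for demonstration and do not reproduce the paper's actual Scholarly Events Quality Assessment metrics.

    # Hypothetical sketch (not the paper's SEQA suite): descriptive statistics
    # over toy scholarly-event metadata, assuming pandas is installed.
    import pandas as pd

    # Illustrative metadata for two fictitious event series.
    events = pd.DataFrame({
        "event":       ["CONF-A", "CONF-A", "CONF-A", "CONF-B", "CONF-B"],
        "year":        [2016, 2017, 2018, 2017, 2018],
        "submissions": [410, 455, 500, 120, 150],
        "accepted":    [95, 100, 110, 40, 45],
    })

    # A simple quality indicator: acceptance rate per event edition.
    events["acceptance_rate"] = events["accepted"] / events["submissions"]

    # Descriptive statistics per event series.
    print(events.groupby("event")["acceptance_rate"].agg(["mean", "min", "max"]))

    # A simple progress indicator: relative growth in submissions
    # between the first and the last edition of each series.
    growth = events.sort_values("year").groupby("event")["submissions"].agg(
        lambda s: (s.iloc[-1] - s.iloc[0]) / s.iloc[0]
    )
    print(growth)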

Suggested Citation

  • Said Fathalla & Sahar Vahdati & Christoph Lange & Sören Auer, 2020. "Scholarly event characteristics in four fields of science: a metrics-based analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(2), pages 677-705, May.
  • Handle: RePEc:spr:scient:v:123:y:2020:i:2:d:10.1007_s11192-020-03391-y
    DOI: 10.1007/s11192-020-03391-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-020-03391-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-020-03391-y?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Georgina Guilera & Maite Barrios & Juana Gómez-Benito, 2013. "Meta-analysis in psychology: a bibliometric study," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 943-954, March.
    2. Victoria Bakare & Grant Lewison, 2017. "Country over-citation ratios," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1199-1207, November.
    3. Senator Jeong & Hong-Gee Kim, 2010. "Intellectual structure of biomedical informatics reflected in scholarly events," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(2), pages 541-551, November.
    4. Simone Diniz Junqueira Barbosa & Milene Selbach Silveira & Isabela Gasparini, 2017. "What publications metadata tell us about the evolution of a scientific community: the case of the Brazilian human–computer interaction conference series," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 275-300, January.
    5. González-Pereira, Borja & Guerrero-Bote, Vicente P. & Moya-Anegón, Félix, 2010. "A new approach to the metric of journals’ scientific prestige: The SJR indicator," Journal of Informetrics, Elsevier, vol. 4(3), pages 379-391.
    6. Hanaa M. H. Alam El-Din & Ahmed Sharaf Eldin & Amro M. S. A. Hanora, 2016. "Bibliometric analysis of Egyptian publications on Hepatitis C virus from PubMed using data mining of an in-house developed database (HCVDBegy)," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(2), pages 895-915, August.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

    Cited by:

    1. Arthur Lackner & Said Fathalla & Mojtaba Nayyeri & Andreas Behrend & Rainer Manthey & Sören Auer & Jens Lehmann & Sahar Vahdati, 2021. "Analysing the evolution of computer science events leveraging a scholarly knowledge graph: a scientometrics study of top-ranked events in the past decade," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 8129-8151, September.
    2. Sahar Vahdati & Said Fathalla & Christoph Lange & Andreas Behrend & Aysegul Say & Zeynep Say & Sören Auer, 2021. "A comprehensive quality assessment framework for scientific events," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 641-682, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Walters, William H., 2017. "Do subjective journal ratings represent whole journals or typical articles? Unweighted or weighted citation impact?," Journal of Informetrics, Elsevier, vol. 11(3), pages 730-744.
    2. J. A. García & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia & J. Martinez-Baena, 2012. "On first quartile journals which are not of highest impact," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(3), pages 925-943, March.
    3. Dejian Yu & Wanru Wang & Shuai Zhang & Wenyu Zhang & Rongyu Liu, 2017. "A multiple-link, mutually reinforced journal-ranking model to measure the prestige of journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 521-542, April.
    4. Thor, Andreas & Marx, Werner & Leydesdorff, Loet & Bornmann, Lutz, 2016. "Introducing CitedReferencesExplorer (CRExplorer): A program for reference publication year spectroscopy with cited references standardization," Journal of Informetrics, Elsevier, vol. 10(2), pages 503-515.
    5. Kurubaran Ganasegeran & Chee Peng Hor & Mohd Fadzly Amar Jamil & Purnima Devi Suppiah & Juliana Mohd Noor & Norshahida Abdul Hamid & Deik Roy Chuan & Mohd Rizal Abdul Manaf & Alan Swee Hock Ch’ng & Ir, 2021. "Mapping the Scientific Landscape of Diabetes Research in Malaysia (2000–2018): A Systematic Scientometrics Study," IJERPH, MDPI, vol. 18(1), pages 1-20, January.
    6. Chen, Ying & Koch, Thorsten & Zakiyeva, Nazgul & Liu, Kailiang & Xu, Zhitong & Chen, Chun-houh & Nakano, Junji & Honda, Keisuke, 2023. "Article’s scientific prestige: Measuring the impact of individual articles in the web of science," Journal of Informetrics, Elsevier, vol. 17(1).
    7. Fiala, Dalibor, 2012. "Time-aware PageRank for bibliographic networks," Journal of Informetrics, Elsevier, vol. 6(3), pages 370-388.
    8. Hao Wang & Sanhong Deng & Xinning Su, 2016. "A study on construction and analysis of discipline knowledge structure of Chinese LIS based on CSSCI," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1725-1759, December.
    9. Wolfgang Glänzel & Henk F. Moed, 2013. "Opinion paper: thoughts and facts on bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(1), pages 381-394, July.
    10. Mingkun Wei, 2020. "Research on impact evaluation of open access journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1027-1049, February.
    11. Wolfgang Glänzel & András Schubert & Bart Thijs & Koenraad Debackere, 2011. "A priori vs. a posteriori normalisation of citation indicators. The case of journal ranking," Scientometrics, Springer;Akadémiai Kiadó, vol. 87(2), pages 415-424, May.
    12. Christopher Zou & Jordan B. Peterson, 2016. "Quantifying the scientific output of new researchers using the zp-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(3), pages 901-916, March.
    13. Rose, Michael E. & Opolot, Daniel C. & Georg, Co-Pierre, 2022. "Discussants," Research Policy, Elsevier, vol. 51(10).
    14. J. A. García & Rosa Rodriguez-Sánchez & J. Fdez-Valdivia, 2014. "The selection of high-quality manuscripts," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 299-313, January.
    15. Moed, Henk F. & de Moya-Anegon, Felix & Guerrero-Bote, Vicente & Lopez-Illescas, Carmen, 2020. "Are nationally oriented journals indexed in Scopus becoming more international? The effect of publication language and access modality," Journal of Informetrics, Elsevier, vol. 14(2).
    16. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    17. Ludo Waltman & Erjia Yan & Nees Jan Eck, 2011. "A recursive field-normalized bibliometric performance indicator: an application to the field of library and information science," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(1), pages 301-314, October.
    18. Frode Eika Sandnes, 2021. "A bibliometric study of human–computer interaction research activity in the Nordic-Baltic Eight countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4733-4767, June.
    19. Perlin, Marcelo S. & Santos, André A.P. & Imasato, Takeyoshi & Borenstein, Denis & Da Silva, Sergio, 2017. "The Brazilian scientific output published in journals: A study based on a large CV database," Journal of Informetrics, Elsevier, vol. 11(1), pages 18-31.
    20. Zhang, Fang & Wu, Shengli, 2020. "Predicting future influence of papers, researchers, and venues in a dynamic academic network," Journal of Informetrics, Elsevier, vol. 14(2).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:123:y:2020:i:2:d:10.1007_s11192-020-03391-y. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.