Printed from https://ideas.repec.org/a/gam/jpubli/v9y2021i2p16-d541269.html

Conference Accreditation and Need of a Bibliometric Measure to Distinguish Predatory Conferences

Author

Listed:
  • Pooyan Makvandi

    (Centre for Materials Interfaces, Istituto Italiano di Tecnologia, Viale Rinaldo Piaggio 34, 56025 Pontedera, Italy)

  • Anahita Nodehi

    (Department of Statistics, Computer Science, Applications (DiSIA), Florence University, Viale Morgagni 59, 50134 Florence, Italy)

  • Franklin R. Tay

    (The Graduate School, Augusta University, Augusta, GA 30912, USA)

Abstract

Academic conferences offer scientists the opportunity to share their findings and knowledge with other researchers. However, the number of conferences is increasing rapidly worldwide, and researchers receive many unsolicited e-mails from conference organizers. Reading these e-mails and ascertaining their legitimacy takes time. Because not every conference is of high quality, young researchers and scholars need to be able to recognize so-called "predatory conferences," which profit from unsuspecting researchers without the core purpose of advancing science or fostering collaboration. Unlike journals, which possess accreditation indices, there is no appropriate accreditation for international conferences. Here, a bibliometric measure is proposed that enables scholars to evaluate conference quality before attending.
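The measure itself is detailed in the full text. As a purely hypothetical illustration of the kind of bibliometric building block involved, a conference-level h-index, adapting the author-level indicator discussed in the references below, could be computed from the citation counts of a conference's proceedings (the function name and sample data are assumptions, not the authors' method):

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still supports a larger h
        else:
            break
    return h

# Illustrative citation counts for one conference's proceedings
proceedings = [25, 8, 5, 3, 3, 1, 0]
print(h_index(proceedings))  # -> 3 (three papers with >= 3 citations each)
```

A conference with many papers that are never cited would score low on such an index regardless of how many papers it accepts, which is one reason citation-based indicators are attractive for flagging venues that prioritize volume over quality.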

Suggested Citation

  • Pooyan Makvandi & Anahita Nodehi & Franklin R. Tay, 2021. "Conference Accreditation and Need of a Bibliometric Measure to Distinguish Predatory Conferences," Publications, MDPI, vol. 9(2), pages 1-5, April.
  • Handle: RePEc:gam:jpubli:v:9:y:2021:i:2:p:16-:d:541269
    Download full text from publisher

    File URL: https://www.mdpi.com/2304-6775/9/2/16/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2304-6775/9/2/16/
    Download Restriction: no

    References listed on IDEAS

    1. Bornmann, Lutz & Williams, Richard, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
    2. Jaime A. Teixeira da Silva & Aamir Raoof Memon, 2017. "CiteScore: A cite for sore eyes, or a valuable, transparent metric?," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 553-556, April.
    3. Jeffrey Beall, 2012. "Predatory publishers are corrupting open access," Nature, Nature, vol. 489(7415), pages 179-179, September.
    4. Abramo, Giovanni & D’Angelo, Ciriaco Andrea & Felici, Giovanni, 2019. "Predicting publication long-term impact through a combination of early citations and journal impact factor," Journal of Informetrics, Elsevier, vol. 13(1), pages 32-49.
    5. Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.
    6. Lutz Bornmann & Hans‐Dieter Daniel, 2007. "What do we know about the h index?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(9), pages 1381-1385, July.
    7. Walters, William H., 2017. "Do subjective journal ratings represent whole journals or typical articles? Unweighted or weighted citation impact?," Journal of Informetrics, Elsevier, vol. 11(3), pages 730-744.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Boontarika Paphawasit & Ratapol Wudhikarn, 2022. "Investigating Patterns of Research Collaboration and Citations in Science and Technology: A Case of Chiang Mai University," Administrative Sciences, MDPI, vol. 12(2), pages 1-28, June.
    2. Libor Ansorge & Klára Ansorgeová & Mark Sixsmith, 2021. "Plagiarism through Paraphrasing Tools—The Story of One Plagiarized Text," Publications, MDPI, vol. 9(4), pages 1-10, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    2. Ana Teresa Santos & Sandro Mendonça, 2022. "Do papers (really) match journals’ “aims and scope”? A computational assessment of innovation studies," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7449-7470, December.
    3. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    4. Saarela, Mirka & Kärkkäinen, Tommi, 2020. "Can we automate expert-based journal rankings? Analysis of the Finnish publication indicator," Journal of Informetrics, Elsevier, vol. 14(2).
    5. Mingkun Wei, 2020. "Research on impact evaluation of open access journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1027-1049, February.
    6. Yves Fassin, 2021. "Does the Financial Times FT50 journal list select the best management and economics journals?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5911-5943, July.
    7. Judit Dobránszki & Jaime A. Teixeira da Silva, 2019. "Corrective factors for author- and journal-based metrics impacted by citations to accommodate for retractions," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 387-398, October.
    8. Abramo, Giovanni & D'Angelo, Ciriaco Andrea & Di Costa, Flavia, 2021. "The scholarly impact of private sector research: A multivariate analysis," Journal of Informetrics, Elsevier, vol. 15(3).
    9. Zoltán Krajcsák, 2021. "Researcher Performance in Scopus Articles (RPSA) as a New Scientometric Model of Scientific Output: Tested in Business Area of V4 Countries," Publications, MDPI, vol. 9(4), pages 1-23, October.
    10. Aniruddha Maiti & Sai Shi & Slobodan Vucetic, 2023. "An ablation study on the use of publication venue quality to rank computer science departments," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4197-4218, August.
    11. Cabrerizo, F.J. & Alonso, S. & Herrera-Viedma, E. & Herrera, F., 2010. "q2-Index: Quantitative and qualitative evaluation based on the number and impact of papers in the Hirsch core," Journal of Informetrics, Elsevier, vol. 4(1), pages 23-28.
    12. Rey-Martí, Andrea & Ribeiro-Soriano, Domingo & Palacios-Marqués, Daniel, 2016. "A bibliometric analysis of social entrepreneurship," Journal of Business Research, Elsevier, vol. 69(5), pages 1651-1655.
    13. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Costa, 2023. "Correlating article citedness and journal impact: an empirical investigation by field on a large-scale dataset," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1877-1894, March.
    14. Pascal Bador & Thierry Lafouge, 2010. "Comparative analysis between impact factor and h-index for pharmacology and psychiatry journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(1), pages 65-79, July.
    15. Woeginger, Gerhard J., 2008. "An axiomatic analysis of Egghe’s g-index," Journal of Informetrics, Elsevier, vol. 2(4), pages 364-368.
    16. Amin Mazloumian, 2012. "Predicting Scholars' Scientific Impact," PLOS ONE, Public Library of Science, vol. 7(11), pages 1-5, November.
    17. Mohamed Boufarss & Mikael Laakso, 2020. "Open Sesame? Open access priorities, incentives, and policies among higher education institutions in the United Arab Emirates," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1553-1577, August.
    18. Zsolt Kohus & Márton Demeter & László Kun & Eszter Lukács & Katalin Czakó & Gyula Péter Szigeti, 2022. "A Study of the Relation between Byline Positions of Affiliated/Non-Affiliated Authors and the Scientific Impact of European Universities in Times Higher Education World University Rankings," Sustainability, MDPI, vol. 14(20), pages 1-14, October.
    19. Hajar Sotudeh & Zahra Ghasempour & Maryam Yaghtin, 2015. "The citation advantage of author-pays model: the case of Springer and Elsevier OA journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(2), pages 581-608, August.
    20. Anqi Ma & Yu Liu & Xiujuan Xu & Tao Dong, 2021. "A deep-learning based citation count prediction model with paper metadata semantic features," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6803-6823, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jpubli:v:9:y:2021:i:2:p:16-:d:541269. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.