Printed from https://ideas.repec.org/a/gam/jpubli/v9y2021i2p16-d541269.html

Conference Accreditation and Need of a Bibliometric Measure to Distinguish Predatory Conferences

Author

Listed:
  • Pooyan Makvandi

    (Centre for Materials Interfaces, Istituto Italiano di Tecnologia, Viale Rinaldo Piaggio 34, 56025 Pontedera, Italy)

  • Anahita Nodehi

    (Department of Statistics, Computer Science, Applications (DiSIA), Florence University, Viale Morgagni 59, 50134 Florence, Italy)

  • Franklin R. Tay

    (The Graduate School, Augusta University, Augusta, GA 30912, USA)

Abstract

Academic conferences offer scientists the opportunity to share their findings and knowledge with other researchers. However, the number of conferences is rapidly increasing globally, and researchers receive many unsolicited e-mails from conference organizers. Reading these e-mails and ascertaining their legitimacy takes time. Because not every conference is of high quality, young researchers and scholars need to recognize so-called “predatory conferences”, which profit from unsuspecting researchers without the core purpose of advancing science or collaboration. Unlike journals, which possess accreditation indices, there is no appropriate accreditation for international conferences. Here, a bibliometric measure is proposed that enables scholars to evaluate conference quality before attending.

Suggested Citation

  • Pooyan Makvandi & Anahita Nodehi & Franklin R. Tay, 2021. "Conference Accreditation and Need of a Bibliometric Measure to Distinguish Predatory Conferences," Publications, MDPI, vol. 9(2), pages 1-5, April.
  • Handle: RePEc:gam:jpubli:v:9:y:2021:i:2:p:16-:d:541269

    Download full text from publisher

    File URL: https://www.mdpi.com/2304-6775/9/2/16/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2304-6775/9/2/16/
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Jeffrey Beall, 2012. "Predatory publishers are corrupting open access," Nature, Nature, vol. 489(7415), pages 179-179, September.
    2. Bornmann, Lutz & Williams, Richard, 2017. "Can the journal impact factor be used as a criterion for the selection of junior researchers? A large-scale empirical study based on ResearcherID data," Journal of Informetrics, Elsevier, vol. 11(3), pages 788-799.
    3. Abramo, Giovanni & D’Angelo, Ciriaco Andrea & Felici, Giovanni, 2019. "Predicting publication long-term impact through a combination of early citations and journal impact factor," Journal of Informetrics, Elsevier, vol. 13(1), pages 32-49.
    4. Meho, Lokman I., 2019. "Using Scopus’s CiteScore for assessing the quality of computer science conferences," Journal of Informetrics, Elsevier, vol. 13(1), pages 419-433.
    5. Jaime A. Teixeira da Silva & Aamir Raoof Memon, 2017. "CiteScore: A cite for sore eyes, or a valuable, transparent metric?," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 553-556, April.
    6. Lutz Bornmann & Hans‐Dieter Daniel, 2007. "What do we know about the h index?," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(9), pages 1381-1385, July.
    7. Walters, William H., 2017. "Do subjective journal ratings represent whole journals or typical articles? Unweighted or weighted citation impact?," Journal of Informetrics, Elsevier, vol. 11(3), pages 730-744.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project. Subscribe to its RSS feed for this item.


    Cited by:

    1. Boontarika Paphawasit & Ratapol Wudhikarn, 2022. "Investigating Patterns of Research Collaboration and Citations in Science and Technology: A Case of Chiang Mai University," Administrative Sciences, MDPI, vol. 12(2), pages 1-28, June.
    2. Libor Ansorge & Klára Ansorgeová & Mark Sixsmith, 2021. "Plagiarism through Paraphrasing Tools—The Story of One Plagiarized Text," Publications, MDPI, vol. 9(4), pages 1-10, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Raminta Pranckutė, 2021. "Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World," Publications, MDPI, vol. 9(1), pages 1-59, March.
    2. Mingkun Wei, 2020. "Research on impact evaluation of open access journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1027-1049, February.
    3. Ana Teresa Santos & Sandro Mendonça, 2022. "Do papers (really) match journals’ “aims and scope”? A computational assessment of innovation studies," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7449-7470, December.
    4. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    5. Yves Fassin, 2021. "Does the Financial Times FT50 journal list select the best management and economics journals?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5911-5943, July.
    6. Judit Dobránszki & Jaime A. Teixeira da Silva, 2019. "Corrective factors for author- and journal-based metrics impacted by citations to accommodate for retractions," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 387-398, October.
    7. Saarela, Mirka & Kärkkäinen, Tommi, 2020. "Can we automate expert-based journal rankings? Analysis of the Finnish publication indicator," Journal of Informetrics, Elsevier, vol. 14(2).
    8. Lucy Semerjian & Kunle Okaiyeto & Mike O. Ojemaye & Temitope Cyrus Ekundayo & Aboi Igwaran & Anthony I. Okoh, 2021. "Global Systematic Mapping of Road Dust Research from 1906 to 2020: Research Gaps and Future Direction," Sustainability, MDPI, vol. 13(20), pages 1-21, October.
    9. Lin Zhang & Yuanyuan Shang & Ying Huang & Gunnar Sivertsen, 2022. "Gender differences among active reviewers: an investigation based on publons," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(1), pages 145-179, January.
    10. Stephan Puehringer & Johanna Rath & Teresa Griesebner, 2021. "The political economy of academic publishing: On the commodification of a public good," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-21, June.
    11. Solomon, David J. & Laakso, Mikael & Björk, Bo-Christer, 2013. "A longitudinal comparison of citation rates and growth among open access journals," Journal of Informetrics, Elsevier, vol. 7(3), pages 642-650.
    12. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2010. "The h index research output measurement: Two approaches to enhance its accuracy," Journal of Informetrics, Elsevier, vol. 4(3), pages 407-414.
    13. Abramo, Giovanni & D'Angelo, Ciriaco Andrea & Di Costa, Flavia, 2021. "The scholarly impact of private sector research: A multivariate analysis," Journal of Informetrics, Elsevier, vol. 15(3).
    14. Karin Langenkamp & Bodo Rödel & Kerstin Taufenbach & Meike Weiland, 2018. "Open Access in Vocational Education and Training Research," Publications, MDPI, vol. 6(3), pages 1-12, July.
    15. Giovanni Abramo & Ciriaco Andrea D'Angelo & Flavia Di Costa, 2020. "The relative impact of private research on scientific advancement," Papers 2012.04908, arXiv.org.
    16. Zoltán Krajcsák, 2021. "Researcher Performance in Scopus Articles (RPSA) as a New Scientometric Model of Scientific Output: Tested in Business Area of V4 Countries," Publications, MDPI, vol. 9(4), pages 1-23, October.
    17. Abramo, Giovanni & Aksnes, Dag W. & D’Angelo, Ciriaco Andrea, 2020. "Comparison of research performance of Italian and Norwegian professors and universities," Journal of Informetrics, Elsevier, vol. 14(2).
    18. You, Taekho & Park, Jinseo & Lee, June Young & Yun, Jinhyuk & Jung, Woo-Sung, 2022. "Disturbance of questionable publishing to academia," Journal of Informetrics, Elsevier, vol. 16(2).
    19. Aniruddha Maiti & Sai Shi & Slobodan Vucetic, 2023. "An ablation study on the use of publication venue quality to rank computer science departments," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4197-4218, August.
    20. Marcin Kozak & Lutz Bornmann, 2012. "A New Family of Cumulative Indexes for Measuring Scientific Performance," PLOS ONE, Public Library of Science, vol. 7(10), pages 1-4, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jpubli:v:9:y:2021:i:2:p:16-:d:541269. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item about which we are uncertain.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to propagate through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.