
A data science-based framework to categorize academic journals

Authors

Listed:
  • Zahid Halim

    (Ghulam Ishaq Khan Institute of Engineering Sciences and Technology)

  • Shafaq Khan

    (University of Management and Technology)

Abstract

Academic journals play a significant role in disseminating new research insights and knowledge among scientists, and their number has grown substantially in recent years. Scientists prefer to publish their scholarly work at reputed venues, and speed of publication is also an important factor for many when selecting a publication venue. Key indicators for evaluating a journal’s quality include the impact factor, the Source Normalized Impact per Paper (SNIP), and the Hirsch index (h-index). A journal’s ranking indicates its impact and quality relative to other venues in a specific discipline. Various measures can be used for ranking, such as field-specific statistics, intra-discipline rankings, or a combination of both. Earlier, journal ranking was a manual process based on institutional lists created by academic leaders; politicization, biases, and personal interests were the key issues with such categorization. Later, the process evolved into database systems based on the impact factor, SNIP, h-index, or a combination of these. All of this created demand for an independent means of categorizing academic journals. This work presents a data science-based framework that evaluates journals on their key bibliometric indicators and categorizes them automatically. The current proposal is restricted to journals published in the computer science domain. The journal features considered in the proposed framework are: publisher, impact factor, website, CiteScore, SJR (SCImago Journal Rank), SNIP, h-index, country, age, cited half-life, immediacy factor/index, Eigenfactor score, article influence score, open access, percentile, citations, acceptance rate, peer review, and the number of articles published yearly. A dataset covering these 19 features is collected for 660 journals and preprocessed to fill in missing values and scale the features. Three feature selection techniques, namely Mutual Information (MI), minimum Redundancy Maximum Relevance (mRMR), and Statistical Dependency (SD), are used to rank the features. The dataset is then divided vertically into three sets: all features, the top nine features, and the bottom ten features. Two clustering techniques, k-means and k-medoids, are employed to find the optimal number of coherent groups in the dataset; based on a rigorous evaluation, four groups of journals are identified. Two classifiers, k-Nearest Neighbor (k-NN) and an Artificial Neural Network (ANN), are then trained to predict the category of an unknown journal, with the ANN showing an average accuracy of 82.85%. A descriptive analysis of the clusters is also presented to gain insights into the four journal categories. The proposed framework provides an opportunity to independently categorize academic journals using data science methods and multiple significant bibliometric indicators.
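The preprocessing and feature-ranking steps described in the abstract map naturally onto standard tooling. Below is a minimal illustrative sketch in Python with scikit-learn, not the authors' implementation: the file name journals.csv, the median imputation strategy, the assumption that categorical features are already numerically encoded, and the unsupervised reading of Mutual Information (scoring each feature by its average MI with the rest) are all assumptions of this sketch; the mRMR and Statistical Dependency rankings would require additional packages.

```python
# Minimal sketch of the preprocessing and feature-ranking steps described
# above. "journals.csv" is a hypothetical file holding the 660 journals and
# 19 features; categorical columns (publisher, country, open access, ...)
# are assumed to be numerically encoded already.
import numpy as np
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import MinMaxScaler
from sklearn.feature_selection import mutual_info_regression

df = pd.read_csv("journals.csv")

# Fill in missing values and scale to [0, 1], as the abstract describes.
X = SimpleImputer(strategy="median").fit_transform(df)
X = MinMaxScaler().fit_transform(X)

# One plausible unsupervised ranking by Mutual Information: score each
# feature by its average MI with the remaining features. (mRMR and
# Statistical Dependency rankings would need additional packages.)
scores = []
for j in range(X.shape[1]):
    rest = np.delete(X, j, axis=1)
    scores.append(mutual_info_regression(rest, X[:, j], random_state=0).mean())

ranked = sorted(zip(df.columns, scores), key=lambda t: -t[1])
top_nine = [name for name, _ in ranked[:9]]    # the "top nine features" set
bottom_ten = [name for name, _ in ranked[9:]]  # the "bottom ten features" set
```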
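A companion sketch for the grouping and prediction steps, again under stated assumptions: silhouette analysis is used here as the criterion for choosing the number of clusters, plain k-means stands in for both k-means and k-medoids (the latter needs an extra package such as scikit-learn-extra), and the resulting cluster labels serve as the categories on which k-NN and a small neural network are trained.

```python
# Companion sketch: cluster the journals, then train classifiers on the
# cluster labels. Silhouette analysis for choosing k and plain k-means in
# place of k-medoids are assumptions of this sketch, not the paper's method.
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

def kmeans_labels(k):
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

# Pick the cluster count with the best silhouette score.
best_k = max(range(2, 11), key=lambda k: silhouette_score(X, kmeans_labels(k)))
labels = kmeans_labels(best_k)

# Train k-NN and a small ANN to predict the category of an unseen journal.
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
for clf in (KNeighborsClassifier(n_neighbors=5),
            MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)):
    clf.fit(X_tr, y_tr)
    print(type(clf).__name__, "test accuracy:", clf.score(X_te, y_te))
```

On the paper's data the rigorous evaluation settles on four groups; the sketch leaves the cluster count free so the silhouette criterion can be checked against that result.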

Suggested Citation

  • Zahid Halim & Shafaq Khan, 2019. "A data science-based framework to categorize academic journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 393-423, April.
  • Handle: RePEc:spr:scient:v:119:y:2019:i:1:d:10.1007_s11192-019-03035-w
    DOI: 10.1007/s11192-019-03035-w

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-019-03035-w
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-019-03035-w?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Chiang Kao & Hsiou-Wei Lin & San-Lin Chung & Wei-Chi Tsai & Jyh-Shen Chiou & Yen-Liang Chen & Liang-Hsuan Chen & Shih-Chieh Fang & Hwei-Lan Pao, 2008. "Ranking Taiwanese management journals: A case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 76(1), pages 95-115, July.
    2. Lokman I. Meho & Yvonne Rogers, 2008. "Citation counting, citation ranking, and h‐index of human‐computer interaction researchers: A comparison of Scopus and Web of Science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(11), pages 1711-1726, September.
    3. Frederick H. Wallace & Timothy J. Perri, 2018. "Economists behaving badly: publications in predatory journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(2), pages 749-766, May.
    4. Wolfgang Glänzel & Henk F. Moed, 2002. "Journal impact measures in bibliometric research," Scientometrics, Springer;Akadémiai Kiadó, vol. 53(2), pages 171-193, February.
    5. Steven N. Goodman, 2018. "A quality-control test for predatory journals," Nature, Nature, vol. 553(7687), pages 155-155, January.
6. Henk F. Moed, 2011. "The source normalized impact per paper is a valid and sophisticated indicator of journal citation impact," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(1), pages 211-213, January.
    7. Bouyssou, Denis & Marchant, Thierry, 2011. "Bibliometric rankings of journals based on Impact Factors: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 5(1), pages 75-86.
    8. Arne Risa Hole, 2017. "Ranking Economics Journals Using Data From a National Research Evaluation Exercise," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 79(5), pages 621-636, October.
    9. E. Garfield & I. H. Sher, 1963. "New factors in the evaluation of scientific literature through citation indexing," American Documentation, Wiley Blackwell, vol. 14(3), pages 195-201, July.
    10. Fuad T. Aleskerov & Vladimir V. Pislyakov & Andrey N. Subochev, 2014. "Ranking Journals In Economics, Management And Political Science By Social Choice Theory Methods," HSE Working papers WP BRP 27/STI/2014, National Research University Higher School of Economics.
    11. González-Pereira, Borja & Guerrero-Bote, Vicente P. & Moya-Anegón, Félix, 2010. "A new approach to the metric of journals’ scientific prestige: The SJR indicator," Journal of Informetrics, Elsevier, vol. 4(3), pages 379-391.
    12. Vaccario, Giacomo & Medo, Matúš & Wider, Nicolas & Mariani, Manuel Sebastian, 2017. "Quantifying and suppressing ranking bias in a large citation network," Journal of Informetrics, Elsevier, vol. 11(3), pages 766-782.
    13. Aleskerov, Fuad & Chistyakov, Vyacheslav V. & Kalyagin, Valery, 2010. "The threshold aggregation," Economics Letters, Elsevier, vol. 107(2), pages 261-262, May.
    14. Fuad T. Aleskerov & Vladimir V. Pislyakov & Timur V. Vitkup, 2014. "Ranking Journals In Economics, Management And Political Sciences By The Threshold Aggregation Procedure," HSE Working papers WP BRP 73/EC/2014, National Research University Higher School of Economics.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Nisar Ali & Zahid Halim & Syed Fawad Hussain, 2023. "An artificial intelligence-based framework for data-driven categorization of computer scientists: a case study of world’s Top 10 computing departments," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1513-1545, March.
    2. Yves Fassin, 2021. "Does the Financial Times FT50 journal list select the best management and economics journals?," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(7), pages 5911-5943, July.
    3. Adela Laura Popa & Naiana Nicoleta Ţarcă & Dinu Vlad Sasu & Simona Aurelia Bodog & Remus Dorel Roşca & Teodora Mihaela Tarcza, 2022. "Exploring Marketing Insights for Healthcare: Trends and Perspectives Based on Literature Investigation," Sustainability, MDPI, vol. 14(17), pages 1-21, August.
    4. Lin Feng & Jian Zhou & Sheng-Lan Liu & Ning Cai & Jie Yang, 2020. "Analysis of journal evaluation indicators: an experimental study based on unsupervised Laplacian score," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(1), pages 233-254, July.
    5. Croft, William L. & Sack, Jörg-Rüdiger, 2022. "Predicting the citation count and CiteScore of journals one year in advance," Journal of Informetrics, Elsevier, vol. 16(4).
    6. Saarela, Mirka & Kärkkäinen, Tommi, 2020. "Can we automate expert-based journal rankings? Analysis of the Finnish publication indicator," Journal of Informetrics, Elsevier, vol. 14(2).
    7. Yu-Wei Chang, 2021. "Characteristics of high research performance authors in the field of library and information science and those of their articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3373-3391, April.
    8. José Luis Gallego Ortega & Antonio Rodríguez Fuentes & Antonio García Guzmán, 2021. "Application of Mathematical Methods to the Study of Special-Needs Education in Spanish Journals," Mathematics, MDPI, vol. 9(6), pages 1-17, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wolfgang Glänzel & Henk F. Moed, 2013. "Opinion paper: thoughts and facts on bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(1), pages 381-394, July.
    2. Mingkun Wei, 2020. "Research on impact evaluation of open access journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 1027-1049, February.
    3. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    4. Laura Vana & Ronald Hochreiter & Kurt Hornik, 2016. "Computing a journal meta-ranking using paired comparisons and adaptive lasso estimators," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(1), pages 229-251, January.
    5. Waltman, Ludo & van Eck, Nees Jan & van Leeuwen, Thed N. & Visser, Martijn S., 2013. "Some modifications to the SNIP journal impact indicator," Journal of Informetrics, Elsevier, vol. 7(2), pages 272-285.
    6. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    7. Juan Gorraiz & Ursula Ulrych & Wolfgang Glänzel & Wenceslao Arroyo-Machado & Daniel Torres-Salinas, 2022. "Measuring the excellence contribution at the journal level: an alternative to Garfield’s impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7229-7251, December.
    8. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    9. Predrag Dašić, 2015. "State and Analysis of Scientific Journals in the Field of “Economic Sciences” for the Period 1995-2014," Economic Themes, Sciendo, vol. 53(4), pages 547-581, December.
    10. Kun Lu & Isola Ajiferuke & Dietmar Wolfram, 2014. "Extending citer analysis to journal impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 100(1), pages 245-260, July.
    11. Fuad T. Aleskerov & Vladimir V. Pislyakov & Timur V. Vitkup, 2014. "Ranking Journals In Economics, Management And Political Sciences By The Threshold Aggregation Procedure," HSE Working papers WP BRP 73/EC/2014, National Research University Higher School of Economics.
    12. Michel Zitt, 2012. "The journal impact factor: angel, devil, or scapegoat? A comment on J.K. Vanclay’s article 2011," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 485-503, August.
    13. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    14. Henk F. Moed, 2016. "Comprehensive indicator comparisons intelligible to non-experts: the case of two SNIP versions," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(1), pages 51-65, January.
    15. Rosenthal, Edward C. & Weiss, Howard J., 2017. "A data envelopment analysis approach for ranking journals," Omega, Elsevier, vol. 70(C), pages 135-147.
    16. Fiorenzo Franceschini & Domenico Maisano & Luca Mastrogiacomo, 2014. "The citer-success-index: a citer-based indicator to select a subset of elite papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 963-983, November.
    17. Vladimir Pislyakov, 2009. "Comparing two “thermometers”: Impact factors of 20 leading economic journals according to Journal Citation Reports and Scopus," Scientometrics, Springer;Akadémiai Kiadó, vol. 79(3), pages 541-550, June.
    18. Xindi Wang & Zeshui Xu & Xinxin Wang & Marinko Skare, 2022. "A review of inflation from 1906 to 2022: a comprehensive analysis of inflation studies from a global perspective," Oeconomia Copernicana, Institute of Economic Research, vol. 13(3), pages 595-631, September.
    19. Cristina López-Duarte & Marta M. Vidal-Suárez & Belén González-Díaz & Nuno Rosa Reis, 2016. "Understanding the relevance of national culture in international business research: a quantitative analysis," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1553-1590, September.
    20. Subochev, A., 2016. "How Different Are the Existing Ratings of Russian Economic Journals and How to Unify Them?," Journal of the New Economic Association, New Economic Association, vol. 30(2), pages 181-192.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:119:y:2019:i:1:d:10.1007_s11192-019-03035-w. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.