
A Sciento-text framework to characterize research strength of institutions at fine-grained thematic area level

Author

Listed:
  • Ashraf Uddin

    (South Asian University)

  • Jaideep Bhoosreddy

    (University at Buffalo)

  • Marisha Tiwari

    (Banaras Hindu University)

  • Vivek Kumar Singh

    (Banaras Hindu University)

Abstract

This paper presents a Sciento-text framework to characterize and assess the research performance of leading world institutions in fine-grained thematic areas. While most popular university research rankings rank universities either on their overall research performance or on a particular subject, we have tried to devise a system that identifies strong research centres at the more fine-grained level of the research themes within a subject. The computer science (CS) research output of more than 400 universities in the world is taken as the case in point to demonstrate the working of the framework. The Sciento-text framework comprises standard scientometric and text-analytics components. First, every research paper in the data is classified into thematic areas in a systematic manner; then standard scientometric methodology is used to identify and assess the research strengths of different institutions in a particular research theme (say, Artificial Intelligence for the CS domain). The performance of the framework components is evaluated and the complete system is deployed on the Web at www.universityselectplus.com. The framework is extendable to other subject domains with little modification.
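The abstract describes a two-stage pipeline: a text-analytics component first assigns each paper to a thematic area, and standard scientometric indicators are then computed per institution within each theme. The sketch below is only a minimal illustration of that idea under explicit assumptions: a keyword-overlap classifier and per-theme paper counts, citation totals and an h-index stand in for the framework's actual components, which this page does not detail, and the keyword lists, record fields and toy data are hypothetical.

from collections import defaultdict

# Hypothetical keyword map for a few CS thematic areas (illustrative only).
THEME_KEYWORDS = {
    "Artificial Intelligence": {"neural", "learning", "reasoning", "agent"},
    "Databases": {"query", "sql", "transaction", "indexing"},
    "Computer Networks": {"routing", "protocol", "wireless", "latency"},
}

def classify_paper(title, abstract):
    """Assign a paper to the theme whose keywords overlap most with its text."""
    tokens = set((title + " " + abstract).lower().split())
    best_theme, best_overlap = None, 0
    for theme, keywords in THEME_KEYWORDS.items():
        overlap = len(tokens & keywords)
        if overlap > best_overlap:
            best_theme, best_overlap = theme, overlap
    return best_theme  # None if nothing matches

def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for i, c in enumerate(sorted(citation_counts, reverse=True), start=1):
        if c >= i:
            h = i
    return h

def theme_strengths(papers):
    """Group papers by (theme, institution) and report simple strength indicators."""
    groups = defaultdict(list)
    for p in papers:
        theme = classify_paper(p["title"], p["abstract"])
        if theme is not None:
            groups[(theme, p["institution"])].append(p["citations"])
    return {
        key: {"papers": len(cites), "citations": sum(cites), "h_index": h_index(cites)}
        for key, cites in groups.items()
    }

# Toy records with hypothetical fields.
papers = [
    {"title": "Deep learning for agent reasoning", "abstract": "neural methods",
     "institution": "Univ A", "citations": 40},
    {"title": "Query optimization and indexing", "abstract": "sql transaction benchmarks",
     "institution": "Univ B", "citations": 12},
]
print(theme_strengths(papers))

In practice, any classifier (supervised, clustering-based or keyword-based) and any citation indicator could be plugged into this two-stage structure; the authors' own choices are described in the full text of the article.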

Suggested Citation

  • Ashraf Uddin & Jaideep Bhoosreddy & Marisha Tiwari & Vivek Kumar Singh, 2016. "A Sciento-text framework to characterize research strength of institutions at fine-grained thematic area level," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(3), pages 1135-1150, March.
  • Handle: RePEc:spr:scient:v:106:y:2016:i:3:d:10.1007_s11192-016-1836-2
    DOI: 10.1007/s11192-016-1836-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-016-1836-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-016-1836-2?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Zhang, Lin & Liu, Xinhai & Janssens, Frizo & Liang, Liming & Glänzel, Wolfgang, 2010. "Subject clustering analysis based on ISI category classification," Journal of Informetrics, Elsevier, vol. 4(2), pages 185-193.
    2. Aparna Basu & Ritu Aggarwal, 2001. "International Collaboration in Science in India and its Impact on Institutional Performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 52(3), pages 379-394, November.
    3. Bordons, María & Aparicio, Javier & González-Albo, Borja & Díaz-Faes, Adrián A., 2015. "The relationship between the research performance of scientists and their position in co-authorship networks in three fields," Journal of Informetrics, Elsevier, vol. 9(1), pages 135-144.
    4. Loet Leydesdorff & Lutz Bornmann & Rüdiger Mutz & Tobias Opthof, 2011. "Turning the tables on citation analysis one more time: Principles for comparing sets of documents," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(7), pages 1370-1381, July.
    5. Wolfgang Glänzel & Henk F. Moed, 2013. "Opinion paper: thoughts and facts on bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(1), pages 381-394, July.
    6. Lutz Bornmann & Felix Moya Anegón & Rüdiger Mutz, 2013. "Do universities or research institutions with a specific subject profile have an advantage or a disadvantage in institutional rankings?," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(11), pages 2310-2316, November.
    7. Bornmann, Lutz & Leydesdorff, Loet & Wang, Jian, 2013. "Which percentile-based approach should be preferred for calculating normalized citation impact values? An empirical comparison of five approaches including a newly developed citation-rank approach (P100)," Journal of Informetrics, Elsevier, vol. 7(4), pages 933-944.
    8. Bornmann, Lutz & Leydesdorff, Loet & Mutz, Rüdiger, 2013. "The use of percentiles and percentile rank classes in the analysis of bibliometric data: Opportunities and limits," Journal of Informetrics, Elsevier, vol. 7(1), pages 158-165.
    9. Necmi Avkiran & Karen Alpert, 2015. "The influence of co-authorship on article impact in OR/MS/OM and the exchange of knowledge with Finance in the twenty-first century," Annals of Operations Research, Springer, vol. 235(1), pages 51-73, December.
    10. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication-level classification system of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    11. Loet Leydesdorff & Lutz Bornmann, 2011. "Integrated impact indicators compared with impact factors: An alternative research design with policy implications," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 62(11), pages 2133-2146, November.
    12. Ismael Rafols & Loet Leydesdorff, 2009. "Content‐based and algorithmic classifications of journals: Perspectives on the dynamics of scientific communication and indexer effects," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 60(9), pages 1823-1835, September.
    13. Vivek Kumar Singh & Ashraf Uddin & David Pinto, 2015. "Computer science research: the top 100 institutions in India and in the world," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(2), pages 529-553, August.
    14. Lorenzo Ductor, 2015. "Does Co-authorship Lead to Higher Academic Productivity?," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 77(3), pages 385-407, June.
    15. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    16. Alain Molinari & Jean-Francois Molinari, 2008. "Mathematical aspects of a new criterion for ranking scientific institutions based on the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 75(2), pages 339-356, May.
    17. Ludo Waltman & Michael Schreiber, 2013. "On the calculation of percentile-based bibliometric indicators," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 64(2), pages 372-379, February.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Hiran H. Lathabai & Abhirup Nandy & Vivek Kumar Singh, 2021. "x-index: Identifying core competency and thematic research strengths of institutions using an NLP and network based ranking framework," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9557-9583, December.
    2. Aparna Basu & Sumit Kumar Banshal & Khushboo Singhal & Vivek Kumar Singh, 2016. "Designing a Composite Index for research performance evaluation at the national or regional level: ranking Central Universities in India," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1171-1193, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a minimal sketch of this kind of overlap ranking follows the list below.
    1. Schreiber, Michael, 2014. "How to improve the outcome of performance evaluations in terms of percentiles for citation frequencies of my papers," Journal of Informetrics, Elsevier, vol. 8(4), pages 873-879.
    2. Aparna Basu & Sumit Kumar Banshal & Khushboo Singhal & Vivek Kumar Singh, 2016. "Designing a Composite Index for research performance evaluation at the national or regional level: ranking Central Universities in India," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1171-1193, June.
    3. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    4. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    5. Brito, Ricardo & Rodríguez-Navarro, Alonso, 2018. "Research assessment by percentile-based double rank analysis," Journal of Informetrics, Elsevier, vol. 12(1), pages 315-329.
    6. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    7. Yves Fassin, 2020. "The HF-rating as a universal complement to the h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 965-990, November.
    8. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.
    9. Albarrán, Pedro & Herrero, Carmen & Ruiz-Castillo, Javier & Villar, Antonio, 2017. "The Herrero-Villar approach to citation impact," Journal of Informetrics, Elsevier, vol. 11(2), pages 625-640.
    10. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    11. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    12. Bornmann, Lutz & Haunschild, Robin, 2016. "Citation score normalized by cited references (CSNCR): The introduction of a new citation impact indicator," Journal of Informetrics, Elsevier, vol. 10(3), pages 875-887.
    13. Schreiber, Michael, 2014. "Is the new citation-rank approach P100′ in bibliometrics really new?," Journal of Informetrics, Elsevier, vol. 8(4), pages 997-1004.
    14. Bornmann, Lutz & Leydesdorff, Loet & Wang, Jian, 2013. "Which percentile-based approach should be preferred for calculating normalized citation impact values? An empirical comparison of five approaches including a newly developed citation-rank approach (P100)," Journal of Informetrics, Elsevier, vol. 7(4), pages 933-944.
    15. Bouyssou, Denis & Marchant, Thierry, 2016. "Ranking authors using fractional counting of citations: An axiomatic approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 183-199.
    16. Waltman, Ludo & van Eck, Nees Jan, 2013. "A systematic empirical comparison of different approaches for normalizing citation impact indicators," Journal of Informetrics, Elsevier, vol. 7(4), pages 833-849.
    17. Lutz Bornmann & Robin Haunschild, 2016. "How to normalize Twitter counts? A first attempt based on journals in the Twitter Index," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1405-1422, June.
    18. Diego Chavarro & Puay Tang & Ismael Rafols, 2014. "Interdisciplinarity and research on local issues: evidence from a developing country," Research Evaluation, Oxford University Press, vol. 23(3), pages 195-209.
    19. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
    20. Lin Zhang & Beibei Sun & Fei Shu & Ying Huang, 2022. "Comparing paper level classifications across different methods and systems: an investigation of Nature publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 7633-7651, December.
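    As noted above the list, relatedness here is based on overlap between what an item cites and what cites it. The short sketch below is a hypothetical illustration of such an overlap ranking, not the actual IDEAS/CitEc algorithm; all identifiers and fields are made up.

    def relatedness(item, other):
        """Count shared cited works plus shared citing works."""
        return (len(item["references"] & other["references"])
                + len(item["cited_by"] & other["cited_by"]))

    def most_related(target, candidates, top_n=5):
        """Rank candidate items by citation overlap with the target item."""
        scored = sorted(((relatedness(target, c), c["id"]) for c in candidates), reverse=True)
        return [item_id for score, item_id in scored[:top_n] if score > 0]

    # Toy items with hypothetical handles.
    target = {"id": "this-item", "references": {"r1", "r2", "r3"}, "cited_by": {"c1"}}
    candidates = [
        {"id": "item-a", "references": {"r2", "r3"}, "cited_by": {"c1", "c2"}},
        {"id": "item-b", "references": {"r9"}, "cited_by": {"c7"}},
    ]
    print(most_related(target, candidates))  # ['item-a']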
