
Publication counting methods for a national research evaluation exercise

Author

Listed:
  • Korytkowski, Przemyslaw
  • Kulczycki, Emanuel

Abstract

In this paper, we investigate the effects of using four methods of publication counting (complete, whole, fractional, square root fractional) and of limiting the number of publications (at researcher and institution levels) on the results of a national research evaluation exercise across fields, using Polish data. We use bibliographic information on 0.58 million publications from the 2013–2016 period. Our analysis reveals that the largest effects are in those fields within which a variety of publication and cooperation patterns can be observed (e.g. in Physical sciences or History and archeology). We argue that selecting the publication counting method for national evaluation purposes needs to take into account the current situation in the given country in terms of the excellence of research outcomes, the level of internal, external and international collaboration, and the publication patterns in the various fields of science. Our findings show that the social sciences and humanities are not significantly influenced by the different publication counting methods or by limiting the number of publications included in the evaluation, as publication patterns in these fields are quite different from those observed in the so-called hard sciences. When discussing the goals of any national research evaluation system, we should be aware that the ways of achieving these goals are closely related to the publication counting method, which can serve as an incentive for certain publication practices.
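
The abstract names four counting methods and a cap on the number of publications entering the evaluation. As a rough illustration of how the counting methods differ, the sketch below implements one common set of definitions in Python; it is not the authors' code, the labels "complete" and "whole" vary somewhat across the literature, and details of the Polish rules (such as lower bounds on fractional credit or the publication limits themselves) are not reproduced.

# Minimal sketch (not the authors' code): credit per institution for a single
# publication under four counting methods. The definitions are one common
# reading; the exact rules of the Polish evaluation exercise may differ.
from collections import Counter
from math import sqrt

def count_publication(affiliations, method="fractional"):
    """affiliations: one institution name per author (illustrative input format)."""
    n_authors = len(affiliations)
    per_institution = Counter(affiliations)      # authors per institution
    credits = {}
    for inst, k in per_institution.items():
        if method == "complete":                 # one credit per contributing author
            credits[inst] = float(k)
        elif method == "whole":                  # one credit per distinct institution
            credits[inst] = 1.0
        elif method == "fractional":             # a single credit split equally among authors
            credits[inst] = k / n_authors
        elif method == "sqrt_fractional":        # square-root variant, softening the
            credits[inst] = sqrt(k / n_authors)  # penalty for large collaborations
        else:
            raise ValueError(f"unknown method: {method}")
    return credits

# Example: three authors, two from "Univ A" and one from "Univ B"
for m in ("complete", "whole", "fractional", "sqrt_fractional"):
    print(m, count_publication(["Univ A", "Univ A", "Univ B"], m))

For this example, fractional counting credits Univ A with 2/3 and Univ B with 1/3, while the square-root variant yields about 0.82 and 0.58, which illustrates why the choice of counting method matters most in fields with large, multi-institutional collaborations.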

Suggested Citation

  • Korytkowski, Przemyslaw & Kulczycki, Emanuel, 2019. "Publication counting methods for a national research evaluation exercise," Journal of Informetrics, Elsevier, vol. 13(3), pages 804-816.
  • Handle: RePEc:eee:infome:v:13:y:2019:i:3:p:804-816
    DOI: 10.1016/j.joi.2019.07.001

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157718305029
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2019.07.001?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ludo Waltman & Clara Calero-Medina & Joost Kosten & Ed C.M. Noyons & Robert J.W. Tijssen & Nees Jan van Eck & Thed N. van Leeuwen & Anthony F.J. van Raan & Martijn S. Visser & Paul Wouters, 2012. "The Leiden ranking 2011/2012: Data collection, indicators, and interpretation," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2419-2432, December.
    2. Kaare Aagaard & Carter Bloch & Jesper W. Schneider, 2015. "Impacts of performance-based research funding systems: The case of the Norwegian Publication Indicator," Research Evaluation, Oxford University Press, vol. 24(2), pages 106-117.
    3. Hagen, Nils T., 2014. "Counting and comparing publication output with and without equalizing and inflationary bias," Journal of Informetrics, Elsevier, vol. 8(2), pages 310-317.
    4. Emanuel Kulczycki & Tim C. E. Engels & Janne Pölönen & Kasper Bruun & Marta Dušková & Raf Guns & Robert Nowotniak & Michal Petr & Gunnar Sivertsen & Andreja Istenič Starčič & Alesia Zuccala, 2018. "Publication patterns in the social sciences and humanities: evidence from eight European countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 463-486, July.
    5. Emanuel Kulczycki & Ewa A. Rozkosz, 2017. "Does an expert-based evaluation allow us to go beyond the Impact Factor? Experiences from building a ranking of national journals in Poland," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(1), pages 417-442, April.
    6. Gunnar Sivertsen & Birger Larsen, 2012. "Comprehensive bibliographic coverage of the social sciences and humanities in a citation index: an empirical analysis of the potential," Scientometrics, Springer;Akadémiai Kiadó, vol. 91(2), pages 567-575, May.
    7. Marianne Gauffriau & Peder Olesen Larsen & Isabelle Maye & Anne Roulin-Perriard & Markus Ins, 2008. "Comparisons of results of publication counting using different methods," Scientometrics, Springer;Akadémiai Kiadó, vol. 77(1), pages 147-176, October.
    8. Fredrik Niclas Piro & Dag W. Aksnes & Kristoffer Rørstad, 2013. "A macro analysis of productivity differences across fields: Challenges in the measurement of scientific publishing," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(2), pages 307-320, February.
    9. Gauffriau, Marianne, 2017. "A categorization of arguments for counting methods for publication and citation indicators," Journal of Informetrics, Elsevier, vol. 11(3), pages 672-684.
    10. Mu-Hsuan Huang & Chi-Shiou Lin & Dar-Zen Chen, 2011. "Counting methods, country rank changes, and counting inflation in the assessment of national research productivity and impact," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 62(12), pages 2427-2436, December.
    11. Linda Sīle & Janne Pölönen & Gunnar Sivertsen & Raf Guns & Tim C E Engels & Pavel Arefiev & Marta Dušková & Lotte Faurbæk & András Holl & Emanuel Kulczycki & Bojan Macan & Gustaf Nelhans & Michal Petr, 2018. "Comprehensiveness of national bibliographic databases for social sciences and humanities: Findings from a European survey," Research Evaluation, Oxford University Press, vol. 27(4), pages 310-322.
    12. Ad A.M. Prins & Rodrigo Costas & Thed N. van Leeuwen & Paul F. Wouters, 2016. "Using Google Scholar in research evaluation of humanities and social science programs: A comparison with Web of Science data," Research Evaluation, Oxford University Press, vol. 25(3), pages 264-270.
    13. Kulczycki, Emanuel & Korzeń, Marcin & Korytkowski, Przemysław, 2017. "Toward an excellence-based research funding system: Evidence from Poland," Journal of Informetrics, Elsevier, vol. 11(1), pages 282-298.
    14. Peder Olesen Larsen, 2008. "The state of the art in publication counting," Scientometrics, Springer;Akadémiai Kiadó, vol. 77(2), pages 235-251, November.
    15. Waltman, Ludo & van Eck, Nees Jan, 2015. "Field-normalized citation impact indicators and the choice of an appropriate counting method," Journal of Informetrics, Elsevier, vol. 9(4), pages 872-894.
    16. Truyken L. B. Ossenblok & Tim C. E. Engels & Gunnar Sivertsen, 2012. "The representation of the social sciences and humanities in the Web of Science – a comparison of publication patterns and incentive structures in Flanders and Norway (2005–9)," Research Evaluation, Oxford University Press, vol. 21(4), pages 280-290, September.
    17. Aksnes, Dag W. & Schneider, Jesper W. & Gunnarsson, Magnus, 2012. "Ranking national research systems by citation indicators. A comparative analysis using whole and fractionalised counting methods," Journal of Informetrics, Elsevier, vol. 6(1), pages 36-43.
    18. G. Van Hooydonk, 1997. "Fractional counting of multiauthored publications: Consequences for the impact of authors," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 48(10), pages 944-945, October.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Cappelletti-Montano, Beniamino & Columbu, Silvia & Montaldo, Stefano & Musio, Monica, 2022. "Interpreting the outcomes of research assessments: A geometrical approach," Journal of Informetrics, Elsevier, vol. 16(1).
    2. Roberta Ruggieri & Fabrizio Pecoraro & Daniela Luzi, 2021. "An intersectional approach to analyse gender productivity and open access: a bibliometric analysis of the Italian National Research Council," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(2), pages 1647-1673, February.
    3. Emanuel Kulczycki & Przemysław Korytkowski, 2020. "Researchers publishing monographs are more productive and more local-oriented," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1371-1387, November.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    2. Hagen, Nils T., 2014. "Counting and comparing publication output with and without equalizing and inflationary bias," Journal of Informetrics, Elsevier, vol. 8(2), pages 310-317.
    3. Pär Sundling, 2023. "Author contributions and allocation of authorship credit: testing the validity of different counting methods in the field of chemical biology," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(5), pages 2737-2762, May.
    4. Rahman, Mohammad Tariqur & Regenstein, Joe Mac & Kassim, Noor Lide Abu & Haque, Nazmul, 2017. "The need to quantify authors’ relative intellectual contributions in a multi-author paper," Journal of Informetrics, Elsevier, vol. 11(1), pages 275-281.
    5. Waltman, Ludo & van Eck, Nees Jan, 2015. "Field-normalized citation impact indicators and the choice of an appropriate counting method," Journal of Informetrics, Elsevier, vol. 9(4), pages 872-894.
    6. Fairclough, Ruth & Thelwall, Mike, 2015. "National research impact indicators from Mendeley readers," Journal of Informetrics, Elsevier, vol. 9(4), pages 845-859.
    7. Jesper W. Schneider & Thed Leeuwen & Martijn Visser & Kaare Aagaard, 2019. "Examining national citation impact by comparing developments in a fixed and a dynamic journal set," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 973-985, May.
    8. Denis Kosyakov & Andrey Guskov, 2022. "Reasons and consequences of changes in Russian research assessment policies," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4609-4630, August.
    9. Emanuel Kulczycki & Przemysław Korytkowski, 2020. "Researchers publishing monographs are more productive and more local-oriented," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1371-1387, November.
    10. Potter, Ross W.K. & Szomszor, Martin & Adams, Jonathan, 2020. "Interpreting CNCIs on a country-scale: The effect of domestic and international collaboration type," Journal of Informetrics, Elsevier, vol. 14(4).
    11. Jeffrey Demaine, 2022. "Fractionalization of research impact reveals global trends in university collaboration," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2235-2247, May.
    12. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    13. Sandro Tarkhan-Mouravi, 2020. "Traditional indicators inflate some countries’ scientific impact over 10 times," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 337-356, April.
    14. Torger Möller & Marion Schmidt & Stefan Hornbostel, 2016. "Assessing the effects of the German Excellence Initiative with bibliometric methods," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2217-2239, December.
    15. Saarela, Mirka & Kärkkäinen, Tommi, 2020. "Can we automate expert-based journal rankings? Analysis of the Finnish publication indicator," Journal of Informetrics, Elsevier, vol. 14(2).
    16. Fairclough, Ruth & Thelwall, Mike, 2015. "More precise methods for national research citation impact comparisons," Journal of Informetrics, Elsevier, vol. 9(4), pages 895-906.
    17. Csomós, György, 2020. "Introducing recalibrated academic performance indicators in the evaluation of individuals’ research performance: A case study from Eastern Europe," Journal of Informetrics, Elsevier, vol. 14(4).
    18. Gauffriau, Marianne, 2017. "A categorization of arguments for counting methods for publication and citation indicators," Journal of Informetrics, Elsevier, vol. 11(3), pages 672-684.
    19. Przemysław Korytkowski & Emanuel Kulczycki, 2019. "Examining how country-level science policy shapes publication patterns: the case of Poland," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1519-1543, June.
    20. Lutz Bornmann & Richard Williams, 2020. "An evaluation of percentile measures of citation impact, and a proposal for making them better," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1457-1478, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:13:y:2019:i:3:p:804-816. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link it to an item in RePEc, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.