
Bibliometric Evaluation vs. Informed Peer Review: Evidence from Italy

Authors

  • Jappelli, Tullio
  • Peracchi, Franco
  • Bertocchi, Graziella
  • Gambardella, Alfonso
  • Nappi, Carmela A.

Abstract

A relevant question for the organization of large-scale research assessments is whether bibliometric evaluation and informed peer review, in which reviewers know where the work was published, yield similar results. A positive answer would suggest, for instance, that less costly bibliometric evaluation might, at least partly, replace informed peer review, or that bibliometric evaluation could reliably monitor research between assessment exercises. We draw on our experience of evaluating Italian research in Economics, Business and Statistics, where almost 12,000 publications dated 2004-2010 were assessed. A random sample from the available population of journal articles shows that informed peer review and bibliometric analysis produce similar evaluations of the same set of papers. Whether because of independent convergence in assessment or the influence of bibliometric information on the community of reviewers, the implication for the organization of these exercises is that the two approaches are substitutes.
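
The abstract does not spell out the statistical procedure used to compare the two evaluations of the sampled articles. As a purely illustrative sketch, the Python snippet below computes a quadratically weighted Cohen's kappa between two sets of ordinal merit scores; the four-level scale, the example peer/biblio vectors, and the choice of weighted kappa itself are assumptions for illustration and are not taken from the paper.

    import numpy as np

    def weighted_kappa(scores_a, scores_b, n_categories):
        """Quadratically weighted Cohen's kappa between two ordinal ratings."""
        a = np.asarray(scores_a)
        b = np.asarray(scores_b)
        # Joint distribution of the two ratings over the merit classes
        observed = np.zeros((n_categories, n_categories))
        for i, j in zip(a, b):
            observed[i, j] += 1
        observed /= observed.sum()
        # Expected joint distribution if the two ratings were independent
        expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
        # Quadratic disagreement weights: zero on the diagonal, growing with distance
        idx = np.arange(n_categories)
        disagreement = ((idx[:, None] - idx[None, :]) / (n_categories - 1)) ** 2
        return 1.0 - (disagreement * observed).sum() / (disagreement * expected).sum()

    # Hypothetical merit classes coded 0-3 (lowest to highest)
    peer = [3, 2, 2, 1, 0, 3, 2, 1]
    biblio = [3, 2, 1, 1, 0, 3, 3, 1]
    print(round(weighted_kappa(peer, biblio, 4), 3))

A value near 1 would indicate close agreement between peer-review and bibliometric scores; a value near 0 would indicate no agreement beyond chance.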

Suggested Citation

  • Jappelli, Tullio & Peracchi, Franco & Bertocchi, Graziella & Gambardella, Alfonso & Nappi, Carmela A., 2013. "Bibliometric Evaluation vs. Informed Peer Review: Evidence from Italy," CEPR Discussion Papers 9724, C.E.P.R. Discussion Papers.
  • Handle: RePEc:cpr:ceprdp:9724

    Download full text from publisher

    File URL: https://cepr.org/publications/DP9724
    Download Restriction: CEPR Discussion Papers are free to download for our researchers, subscribers and members. If you fall into one of these categories but have trouble downloading our papers, please contact us at subscribers@cepr.org

    As the access to this document is restricted, you may want to look for a different version below or search for a different version of it.



    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Mingers, John & Yang, Liying, 2017. "Evaluating journal quality: A review of journal citation indicators and ranking in business and management," European Journal of Operational Research, Elsevier, vol. 257(1), pages 323-337.
    2. John Gibson & David L. Anderson & John Tressler, 2017. "Citations Or Journal Quality: Which Is Rewarded More In The Academic Labor Market?," Economic Inquiry, Western Economic Association International, vol. 55(4), pages 1945-1965, October.
    3. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    4. Buehling, Kilian, 2021. "Changing research topic trends as an effect of publication rankings – The case of German economists and the Handelsblatt Ranking," Journal of Informetrics, Elsevier, vol. 15(3).
    5. Berlemann, Michael & Haucap, Justus, 2015. "Which factors drive the decision to opt out of individual research rankings? An empirical study of academic resistance to change," Research Policy, Elsevier, vol. 44(5), pages 1108-1115.
    6. Brooks, Chris & Fenton, Evelyn M. & Walker, James T., 2014. "Gender and the evaluation of research," Research Policy, Elsevier, vol. 43(6), pages 990-1001.
    7. Rebora, Gianfranco & Turri, Matteo, 2013. "The UK and Italian research assessment exercises face to face," Research Policy, Elsevier, vol. 42(9), pages 1657-1666.
    8. David L. Anderson & John Tressler, 2016. "Citation-Capture Rates for Economics Journals: Do they Differ from Other Disciplines and Does it Matter?," Economic Papers, The Economic Society of Australia, vol. 35(1), pages 73-85, March.
    9. Ekaterina L. Dyachenko, 2014. "Internationalization of academic journals: Is there still a gap between social and natural sciences?," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 241-255, October.
    10. Giovanni Abramo & Ciriaco Andrea D’Angelo & Myroslava Hladchenko, 2023. "Assessing the effects of publication requirements for professorship on research performance and publishing behaviour of Ukrainian academics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4589-4609, August.
    11. Christian Schneijderberg & Nicolai Götze & Lars Müller, 2022. "A study of 25 years of publication outputs in the German academic profession," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(1), pages 1-28, January.
    12. Johannes König & David I. Stern & Richard S.J. Tol, 2022. "Confidence Intervals for Recursive Journal Impact Factors," Tinbergen Institute Discussion Papers 22-038/III, Tinbergen Institute.
    13. David A. Pendlebury, 2019. "Charting a path between the simple and the false and the complex and unusable: Review of Henk F. Moed, Applied Evaluative Informetrics [in the series Qualitative and Quantitative Analysis of Scientifi," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 549-560, April.
    14. Salter, Ammon & Salandra, Rossella & Walker, James, 2017. "Exploring preferences for impact versus publications among UK business and management academics," Research Policy, Elsevier, vol. 46(10), pages 1769-1782.
    15. Rafols, Ismael & Stirling, Andy, 2020. "Designing indicators for opening up evaluation. Insights from research assessment," SocArXiv h2fxp, Center for Open Science.
    16. Gabriel-Alexandru Vîiu & Mihai Păunescu, 2021. "The citation impact of articles from which authors gained monetary rewards based on journal metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4941-4974, June.
    17. David L. Anderson & John Tressler, 2017. "Researcher rank stability across alternative output measurement schemes in the context of a time limited research evaluation: the New Zealand case," Applied Economics, Taylor & Francis Journals, vol. 49(45), pages 4542-4553, September.
    18. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    19. Loet Leydesdorff & Paul Wouters & Lutz Bornmann, 2016. "Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2129-2150, December.
    20. Vasilios D. Kosteas, 2018. "Predicting long-run citation counts for articles in top economics journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(3), pages 1395-1412, June.

    More about this item

    Keywords

    Bibliometric evaluation; Peer review; Research assessment; VQR

    JEL classification:

    • C80 - Mathematical and Quantitative Methods - - Data Collection and Data Estimation Methodology; Computer Programs - - - General
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions
    • O30 - Economic Development, Innovation, Technological Change, and Growth - - Innovation; Research and Development; Technological Change; Intellectual Property Rights - - - General
