IDEAS home Printed from https://ideas.repec.org/a/spr/scient/v108y2016i3d10.1007_s11192-016-1929-y.html

Do they agree? Bibliometric evaluation versus informed peer review in the Italian research assessment exercise

Author

Listed:
  • Alberto Baccini

    (University of Siena)

  • Giuseppe De Nicolao

    (University of Pavia)

Abstract

During the Italian research assessment exercise, the national agency ANVUR performed an experiment to assess the agreement between grades attributed to journal articles by informed peer review (IR) and by bibliometrics. A sample of articles was evaluated by both methods, and agreement was analyzed by weighted Cohen’s kappas. ANVUR presented the results as indicating an overall “good” or “more than adequate” agreement. This paper re-examines the experiment’s results according to the available statistical guidelines for interpreting kappa values, showing that the degree of agreement (always in the range 0.09–0.42) has to be interpreted, for all research fields, as unacceptable, poor, or, in a few cases, at most fair. The only notable exception, confirmed also by a statistical meta-analysis, was a moderate agreement for economics and statistics (Area 13) and its sub-fields. We show that the experiment protocol adopted in Area 13 was substantially modified with respect to all other research fields, to the point that the results for economics and statistics must be considered fatally flawed. The evidence of poor agreement supports the conclusion that IR and bibliometrics do not produce similar results, and that the adoption of both methods in the Italian research assessment may have introduced systematic and unknown biases into its final results. The conclusion reached by ANVUR must be reversed: the available evidence does not justify the joint use of IR and bibliometrics within the same research assessment exercise.
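The agreement measure at the center of the paper is weighted Cohen’s kappa, which, unlike the unweighted version, penalizes disagreements between ordinal grades in proportion to their distance on the grading scale. A minimal pure-Python sketch of the statistic (the four-point scale and the grade vectors below are hypothetical illustrations, not the experiment’s data):

```python
# Hedged sketch: weighted Cohen's kappa in pure Python.
# The grades below are hypothetical, not the ANVUR experiment's data.

def weighted_kappa(rater1, rater2, categories, weights="linear"):
    """Weighted Cohen's kappa for two ordinal ratings of the same items.

    Disagreements are penalized by their distance on the ordinal scale:
    'linear' weights use |i - j| / (k - 1); 'quadratic' squares that ratio.
    """
    n, k = len(rater1), len(categories)
    idx = {c: i for i, c in enumerate(categories)}

    # Joint distribution of the two ratings (k x k matrix of proportions).
    joint = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        joint[idx[a]][idx[b]] += 1.0 / n

    # Marginal distribution of each rater.
    p1 = [sum(row) for row in joint]
    p2 = [sum(joint[i][j] for i in range(k)) for j in range(k)]

    def w(i, j):  # disagreement weight in [0, 1]
        d = abs(i - j) / (k - 1)
        return d if weights == "linear" else d * d

    observed = sum(w(i, j) * joint[i][j] for i in range(k) for j in range(k))
    expected = sum(w(i, j) * p1[i] * p2[j] for i in range(k) for j in range(k))
    return 1.0 - observed / expected

# Hypothetical example: ten articles graded 1-4 by peer review and by bibliometrics.
peer  = [4, 3, 3, 2, 1, 4, 2, 3, 1, 2]
bibli = [3, 3, 2, 2, 2, 4, 1, 3, 2, 1]
print(f"linear-weighted kappa = {weighted_kappa(peer, bibli, [1, 2, 3, 4]):.2f}")
```

Published interpretation guidelines (such as Landis and Koch’s) read values of roughly 0.41–0.60 as moderate agreement and anything below about 0.20 as slight or poor; this is the kind of yardstick against which the paper reassesses the reported 0.09–0.42 range.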

Suggested Citation

  • Alberto Baccini & Giuseppe De Nicolao, 2016. "Do they agree? Bibliometric evaluation versus informed peer review in the Italian research assessment exercise," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(3), pages 1651-1671, September.
  • Handle: RePEc:spr:scient:v:108:y:2016:i:3:d:10.1007_s11192-016-1929-y
    DOI: 10.1007/s11192-016-1929-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-016-1929-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-016-1929-y?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Bertocchi, Graziella & Gambardella, Alfonso & Jappelli, Tullio & Nappi, Carmela A. & Peracchi, Franco, 2015. "Bibliometric evaluation vs. informed peer review: Evidence from Italy," Research Policy, Elsevier, vol. 44(2), pages 451-466.
    2. Frederic S. Lee, 2007. "The Research Assessment Exercise, the state and the dominance of mainstream economics in British universities," Cambridge Journal of Economics, Oxford University Press, vol. 31(2), pages 309-325, March.
    3. Alessio Ancaiani & Alberto F. Anfossi & Anna Barbara & Sergio Benedetto & Brigida Blasi & Valentina Carletti & Tindaro Cicero & Alberto Ciolfi & Filippo Costa & Giovanna Colizza & Marco Costantini & F, 2015. "Evaluating scientific research in Italy: The 2004–10 research evaluation exercise," Research Evaluation, Oxford University Press, vol. 24(3), pages 242-255.
    4. O. Mryglod & R. Kenna & Yu. Holovatch & B. Berche, 2015. "Predicting results of the research excellence framework using departmental h-index: revisited," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 1013-1017, September.
    5. O. Mryglod & R. Kenna & Yu. Holovatch & B. Berche, 2015. "Predicting results of the Research Excellence Framework using departmental h-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(3), pages 2165-2180, March.
    6. Michael E. D. Koenig, 1983. "Bibliometric indicators versus expert opinion in assessing research performance," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 34(2), pages 136-145, March.
    7. Rinia, E. J. & van Leeuwen, Th. N. & van Vuren, H. G. & van Raan, A. F. J., 1998. "Comparative analysis of a set of bibliometric indicators and central peer review criteria: Evaluation of condensed matter physics in the Netherlands," Research Policy, Elsevier, vol. 27(1), pages 95-107, May.
    8. Giovanni Abramo & Ciriaco Andrea D'Angelo, 2015. "The VQR, Italy's second national research assessment: Methodological failures and ranking distortions," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2202-2214, November.
    9. Anthony F. J. Raan, 2006. "Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 67(3), pages 491-502, June.
    10. repec:mod:depeco:0020 is not listed on IDEAS
    11. Dag W Aksnes & Randi Elisabeth Taxt, 2004. "Peer reviews and bibliometric indicators: a comparative study at a Norwegian university," Research Evaluation, Oxford University Press, vol. 13(1), pages 33-41, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    2. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    3. Abramo, Giovanni, 2018. "Revisiting the scientometric conceptualization of impact and its measurement," Journal of Informetrics, Elsevier, vol. 12(3), pages 590-597.
    4. Daniele Checchi & Alberto Ciolfi & Gianni De Fraja & Irene Mazzotta & Stefano Verzillo, 2021. "Have you Read This? An Empirical Comparison of the British REF Peer Review and the Italian VQR Bibliometric Algorithm," Economica, London School of Economics and Political Science, vol. 88(352), pages 1107-1129, October.
    5. Primož Južnič & Stojan Pečlin & Matjaž Žaucer & Tilen Mandelj & Miro Pušnik & Franci Demšar, 2010. "Scientometric indicators: peer-review, bibliometric methods and conflict of interests," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(2), pages 429-441, November.
    6. Giovanni Abramo & Tindaro Cicero & Ciriaco Andrea D’Angelo, 2013. "National peer-review research assessment exercises for the hard sciences can be a complete waste of money: the Italian case," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(1), pages 311-324, April.
    7. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
    8. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    9. Li, Jiang & Sanderson, Mark & Willett, Peter & Norris, Michael & Oppenheim, Charles, 2010. "Ranking of library and information science researchers: Comparison of data sources for correlating citation data, and expert judgments," Journal of Informetrics, Elsevier, vol. 4(4), pages 554-563.
    10. Giovanni Abramo & Ciriaco Andrea D'Angelo, 2015. "The VQR, Italy's second national research assessment: Methodological failures and ranking distortions," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2202-2214, November.
    11. Jacques Wainer & Paula Vieira, 2013. "Correlations between bibliometrics and peer evaluation for all disciplines: the evaluation of Brazilian scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 395-410, August.
    12. Basso, Antonella & di Tollo, Giacomo, 2022. "Prediction of UK research excellence framework assessment by the departmental h-index," European Journal of Operational Research, Elsevier, vol. 296(3), pages 1036-1049.
    13. Fedderke, J.W. & Goldschmidt, M., 2015. "Does massive funding support of researchers work?: Evaluating the impact of the South African research chair funding initiative," Research Policy, Elsevier, vol. 44(2), pages 467-482.
    14. Takanori Ida & Naomi Fukuzawa, 2013. "Effects of large-scale research funding programs: a Japanese case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 1253-1273, March.
    15. Hui-Zhen Fu & Yuh-Shan Ho, 2013. "Comparison of independent research of China’s top universities using bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(1), pages 259-276, July.
    16. Shahd Al-Janabi & Lee Wei Lim & Luca Aquili, 2021. "Development of a tool to accurately predict UK REF funding allocation," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 8049-8062, September.
    17. Mike Thelwall, 2017. "Are Mendeley reader counts useful impact indicators in all fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(3), pages 1721-1731, December.
    18. James Tooley & Barrie Craven, 2018. "Private Sector Alternatives to the Research Excellence Framework for University League Tables," Economic Affairs, Wiley Blackwell, vol. 38(3), pages 434-443, October.
    19. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    20. Sigifredo Laengle & José M. Merigó & Nikunja Mohan Modak & Jian-Bo Yang, 2020. "Bibliometrics in operations research and management science: a university analysis," Annals of Operations Research, Springer, vol. 294(1), pages 769-813, November.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:108:y:2016:i:3:d:10.1007_s11192-016-1929-y. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.