Printed from https://ideas.repec.org/p/pra/mprapa/50470.html

Metodi bibliometrici e revisione dei pari per la valutazione della ricerca: un confronto metodologico
[Bibliometric and peer review methods for research evaluation: a methodological comparison]

Authors

  • Cicero, Tindaro
  • Malgarini, Marco
  • Nappi, Carmela Anna
  • Peracchi, Franco

Abstract

The Italian research evaluation exercise for the period 2004-2010 analyzed almost 185,000 articles, books, patents and other scientific products submitted by Italian universities and other public research bodies. In most cases, scientific publications were peer reviewed; in the hard sciences, medicine, engineering and economics, however, bibliometric indicators were also used. For those areas, we extracted a representative sample of scientific products, equal to 10% of the reference population of submitted products, to be evaluated with both peer review and bibliometric methods. Our analysis shows that peer-review and bibliometric evaluations exhibit a level of concordance higher than that observed between two different reviewers of the same article. In almost every scientific discipline, however, there is a systematic difference between peer and bibliometric evaluations: more specifically, bibliometric scores are on average significantly higher than those obtained with peer review. Overall, our results fully support the choice, adopted in the Italian exercise, of using both evaluation techniques to assess the quality of Italian research institutions.
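The concordance analysis described in the abstract compares, for each sampled product, the merit class assigned by peer review with the one implied by bibliometric indicators. As a minimal illustration (the class labels, toy scores and choice of statistic below are assumptions for exposition, not the paper's actual data or method), chance-corrected agreement between two such evaluations can be quantified with Cohen's kappa:

```python
# Hedged sketch: measuring agreement between peer-review and bibliometric
# merit classes with Cohen's kappa. The four classes and the sample data
# are illustrative only, not taken from the paper.
from collections import Counter

CLASSES = ["A", "B", "C", "D"]  # hypothetical merit classes, best to worst

def cohen_kappa(rater1, rater2):
    """Chance-corrected agreement between two lists of class labels."""
    assert len(rater1) == len(rater2) and rater1, "need equal-length, non-empty ratings"
    n = len(rater1)
    # Observed proportion of exact agreements
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement if the two evaluations were independent
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[c] * c2[c] for c in CLASSES) / (n * n)
    return (observed - expected) / (1 - expected)

# Toy example: peer-review vs bibliometric classes for 8 products
peer   = ["A", "B", "B", "C", "D", "A", "C", "B"]
biblio = ["A", "A", "B", "B", "D", "A", "C", "A"]
print(round(cohen_kappa(peer, biblio), 3))  # → 0.489
```

A kappa of 1 means perfect agreement and 0 means agreement no better than chance; comparing the peer-vs-bibliometric kappa against the kappa between two human reviewers of the same article is one way to frame the paper's finding that the two methods agree at least as well as two referees do.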

Suggested Citation

  • Cicero, Tindaro & Malgarini, Marco & Nappi, Carmela Anna & Peracchi, Franco, 2013. "Metodi bibliometrici e revisione dei pari per la valutazione della ricerca: un confronto metodologico [Bibliometric and peer review methods for research evaluation: a methodological appraisement]," MPRA Paper 50470, University Library of Munich, Germany.
  • Handle: RePEc:pra:mprapa:50470

    Download full text from publisher

    File URL: https://mpra.ub.uni-muenchen.de/50470/1/MPRA_paper_50470.pdf
    File Function: original version
    Download Restriction: no

    References listed on IDEAS

    1. Bornmann, Lutz & Leydesdorff, Loet, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bornmann, Lutz & Leydesdorff, Loet & Wang, Jian, 2014. "How to improve the prediction based on citation impact percentiles for years shortly after the publication date?," Journal of Informetrics, Elsevier, vol. 8(1), pages 175-180.
    2. Jianhua Hou & Xiucai Yang & Yang Zhang, 2023. "The effect of social media knowledge cascade: an analysis of scientific papers diffusion," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(9), pages 5169-5195, September.
    3. Fei Shu, 2017. "Comment to: Does China need to rethink its metrics- and citation-based research rewards policies?," Scientometrics, Springer;Akadémiai Kiadó, vol. 113(2), pages 1229-1231, November.
    4. Thelwall, Mike & Fairclough, Ruth, 2015. "The influence of time and discipline on the magnitude of correlations between citation counts and quality scores," Journal of Informetrics, Elsevier, vol. 9(3), pages 529-541.
    5. Chi, Yuxue & Tang, Xianyi & Liu, Yijun, 2022. "Exploring the “awakening effect” in knowledge diffusion: a case study of publications in the library and information science domain," Journal of Informetrics, Elsevier, vol. 16(4).
    6. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    7. B Ian Hutchins & Xin Yuan & James M Anderson & George M Santangelo, 2016. "Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level," PLOS Biology, Public Library of Science, vol. 14(9), pages 1-25, September.
    8. Loet Leydesdorff & Lutz Bornmann & Jonathan Adams, 2019. "The integrated impact indicator revisited (I3*): a non-parametric alternative to the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(3), pages 1669-1694, June.
    9. Saeideh Ebrahimy & Jafar Mehrad & Fatemeh Setareh & Massoud Hosseinchari, 2016. "Path analysis of the relationship between visibility and citation: the mediating roles of save, discussion, and recommendation metrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1497-1510, December.
    10. Bouyssou, Denis & Marchant, Thierry, 2014. "An axiomatic approach to bibliometric rankings and indices," Journal of Informetrics, Elsevier, vol. 8(3), pages 449-477.
    11. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    12. Yuyan Jiang & Xueli Liu, 2023. "A construction and empirical research of the journal disruption index based on open citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 3935-3958, July.
    13. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    14. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    15. Mojisola Erdt & Aarthy Nagarajan & Sei-Ching Joanna Sin & Yin-Leng Theng, 2016. "Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1117-1166, November.
    16. Torger Möller & Marion Schmidt & Stefan Hornbostel, 2016. "Assessing the effects of the German Excellence Initiative with bibliometric methods," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 2217-2239, December.
    17. Zohreh Zahedi & Rodrigo Costas & Paul Wouters, 2014. "How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(2), pages 1491-1513, November.
    18. Bornmann, Lutz & Haunschild, Robin, 2016. "Normalization of Mendeley reader impact on the reader- and paper-side: A comparison of the mean discipline normalized reader score (MDNRS) with the mean normalized reader score (MNRS) and bare reader ," Journal of Informetrics, Elsevier, vol. 10(3), pages 776-788.
    19. Fei Shu & Wen Lou & Stefanie Haustein, 2018. "Can Twitter increase the visibility of Chinese publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 505-519, July.
    20. Hou, Jianhua & Yang, Xiucai, 2020. "Social media-based sleeping beauties: Defining, identifying and features," Journal of Informetrics, Elsevier, vol. 14(2).

    More about this item

    Keywords

    Indicatori bibliometrici; revisione dei pari; valutazione della ricerca [Bibliometric indicators; peer review; research evaluation]

    JEL classification:

    • I20 - Health, Education, and Welfare - - Education - - - General
    • I23 - Health, Education, and Welfare - - Education - - - Higher Education; Research Institutions



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.