IDEAS home Printed from https://ideas.repec.org/a/spr/scient/v96y2013i2d10.1007_s11192-013-0969-9.html

Correlations between bibliometrics and peer evaluation for all disciplines: the evaluation of Brazilian scientists

Authors

  • Jacques Wainer (Institute of Computing—UNICAMP)
  • Paula Vieira (Institute of Computing—UNICAMP)

Abstract

This paper correlates the peer evaluations performed in late 2009 by the disciplinary committees of CNPq (a Brazilian funding agency) with some standard bibliometric measures for 55 scientific areas. We compared the decisions to increase, maintain, or decrease a scientist's research scholarship funded by CNPq. We analyzed these decisions for 2,663 Brazilian scientists and computed their correlations (Spearman rho) with 21 different measures, among them: total production, production in the last 5 years, production indexed in Web of Science and Scopus, total citations received (according to WOS, Scopus, and Google Scholar), and h-index and m-quotient (according to the three citation services). The highest correlation per area ranges from 0.29 to 0.95, although some areas show no significant positive correlation with any of the metrics.
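The metrics the abstract names have simple definitions: the h-index is the largest h such that a scientist has h papers with at least h citations each, the m-quotient divides the h-index by career length in years, and Spearman rho is Pearson correlation computed on ranks. A minimal self-contained sketch of these three quantities (the citation lists and career length below are made-up illustrative data, not from the paper):

```python
def h_index(citations):
    """Largest h such that h papers have >= h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

def m_quotient(citations, career_years):
    """h-index divided by years since first publication."""
    return h_index(citations) / career_years

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation on average ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # group tied values and assign them their average rank
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg_rank = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[order[k]] = avg_rank
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Illustrative data: citation counts for one scientist's papers.
cites = [10, 8, 5, 4, 3]
print(h_index(cites))           # -> 4 (4 papers with >= 4 citations)
print(m_quotient(cites, 8))     # -> 0.5
# Correlating a peer-evaluation ranking with a bibliometric ranking:
print(spearman_rho([1, 2, 3, 4], [1, 3, 2, 4]))
```

The paper computes such rho values per area between the committee decisions (an ordinal variable: decrease, maintain, increase) and each of the 21 bibliometric measures.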

Suggested Citation

  • Jacques Wainer & Paula Vieira, 2013. "Correlations between bibliometrics and peer evaluation for all disciplines: the evaluation of Brazilian scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 395-410, August.
  • Handle: RePEc:spr:scient:v:96:y:2013:i:2:d:10.1007_s11192-013-0969-9
    DOI: 10.1007/s11192-013-0969-9

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-013-0969-9
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-013-0969-9?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

1. Lutz Bornmann & Rüdiger Mutz & Hans-Dieter Daniel, 2008. "Are there better indices for evaluation purposes than the h index? A comparison of nine different variants of the h index using data from biomedicine," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(5), pages 830-837, March.
    2. Juan E. Iglesias & Carlos Pecharromán, 2007. "Scaling the h-index for different scientific ISI fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 73(3), pages 303-320, December.
    3. Rinia, E. J. & van Leeuwen, Th. N. & van Vuren, H. G. & van Raan, A. F. J., 1998. "Comparative analysis of a set of bibliometric indicators and central peer review criteria: Evaluation of condensed matter physics in the Netherlands," Research Policy, Elsevier, vol. 27(1), pages 95-107, May.
    4. Michael S. Patterson & Simon Harris, 2009. "The relationship between reviewers’ quality-scores and number of citations for papers published in the journal Physics in Medicine and Biology from 2003–2005," Scientometrics, Springer;Akadémiai Kiadó, vol. 80(2), pages 343-349, August.
5. Anthony F. J. van Raan, 2006. "Comparison of the Hirsch-index with standard bibliometric indicators and with peer judgment for 147 chemistry research groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 67(3), pages 491-502, June.
    6. Emanuela Reale & Anna Barbara & Antonio Costantini, 2007. "Peer review for the evaluation of academic research: lessons from the Italian experience," Research Evaluation, Oxford University Press, vol. 16(3), pages 216-228, September.
    7. Michel Zitt & Suzy Ramanana-Rahary & Elise Bassecoulard, 2005. "Relativity of citation performance and excellence measures: From cross-field to cross-scale effects of field-normalisation," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 373-401, April.
    8. Eduardo A. Oliveira & Enrico A. Colosimo & Daniella R. Martelli & Isabel G. Quirino & Maria Christina L. Oliveira & Leonardo S. Lima & Ana Cristina Simões e Silva & Hercílio Martelli-Júnior, 2012. "Comparison of Brazilian researchers in clinical medicine: are criteria for ranking well-adjusted?," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(2), pages 429-443, February.
9. Ludo Waltman & Nees Jan van Eck & Thed N. van Leeuwen & Martijn S. Visser & Anthony F. J. van Raan, 2011. "On the correlation between bibliometric indicators and peer review: reply to Opthof and Leydesdorff," Scientometrics, Springer;Akadémiai Kiadó, vol. 88(3), pages 1017-1022, September.
    10. Li, Jiang & Sanderson, Mark & Willett, Peter & Norris, Michael & Oppenheim, Charles, 2010. "Ranking of library and information science researchers: Comparison of data sources for correlating citation data, and expert judgments," Journal of Informetrics, Elsevier, vol. 4(4), pages 554-563.
    11. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
    12. Lutz Bornmann & Hans-Dieter Daniel, 2005. "Does the h-index for ranking of scientists really work?," Scientometrics, Springer;Akadémiai Kiadó, vol. 65(3), pages 391-392, December.
    13. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    14. Dag W Aksnes & Randi Elisabeth Taxt, 2004. "Peer reviews and bibliometric indicators: a comparative study at a Norwegian university," Research Evaluation, Oxford University Press, vol. 13(1), pages 33-41, April.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Lutz Bornmann, 2015. "Alternative metrics in scientometrics: a meta-analysis of research into three altmetrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(3), pages 1123-1144, June.
    2. Fabio Zagonari, 2019. "Scientific Production and Productivity for Characterizing an Author’s Publication History: Simple and Nested Gini’s and Hirsch’s Indexes Combined," Publications, MDPI, vol. 7(2), pages 1-30, May.
    3. Mike Thelwall, 2016. "Interpreting correlations between citation counts and other indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(1), pages 337-347, July.
    4. Giovanni Abramo & Ciriaco D’Angelo, 2015. "An assessment of the first “scientific habilitation” for university appointments in Italy," Economia Politica: Journal of Analytical and Institutional Economics, Springer;Fondazione Edison, vol. 32(3), pages 329-357, December.
    5. Jacques Wainer & Michael Eckmann & Anderson Rocha, 2015. "Peer-Selected “Best Papers”—Are They Really That “Good”?," PLOS ONE, Public Library of Science, vol. 10(3), pages 1-12, March.
    6. Lawrence Smolinsky & Daniel S. Sage & Aaron J. Lercher & Aaron Cao, 2021. "Citations versus expert opinions: citation analysis of featured reviews of the American Mathematical Society," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(5), pages 3853-3870, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    2. Li, Jiang & Sanderson, Mark & Willett, Peter & Norris, Michael & Oppenheim, Charles, 2010. "Ranking of library and information science researchers: Comparison of data sources for correlating citation data, and expert judgments," Journal of Informetrics, Elsevier, vol. 4(4), pages 554-563.
    3. Ehsan Mohammadi & Mike Thelwall, 2013. "Assessing non-standard article impact using F1000 labels," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 383-395, November.
    4. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
    5. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    6. Jacques Wainer & Michael Eckmann & Anderson Rocha, 2015. "Peer-Selected “Best Papers”—Are They Really That “Good”?," PLOS ONE, Public Library of Science, vol. 10(3), pages 1-12, March.
    7. Deming Lin & Tianhui Gong & Wenbin Liu & Martin Meyer, 2020. "An entropy-based measure for the evolution of h index research," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2283-2298, December.
    8. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    9. J. W. Fedderke, 2013. "The objectivity of national research foundation peer review in South Africa assessed against bibliometric indexes," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 177-206, November.
    10. Mingers, John & Leydesdorff, Loet, 2015. "A review of theory and practice in scientometrics," European Journal of Operational Research, Elsevier, vol. 246(1), pages 1-19.
    11. Abramo, Giovanni, 2018. "Revisiting the scientometric conceptualization of impact and its measurement," Journal of Informetrics, Elsevier, vol. 12(3), pages 590-597.
    12. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    13. Giovanni Abramo & Tindaro Cicero & Ciriaco Andrea D’Angelo, 2013. "National peer-review research assessment exercises for the hard sciences can be a complete waste of money: the Italian case," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(1), pages 311-324, April.
    14. Christopher McCarty & James W. Jawitz & Allison Hopkins & Alex Goldman, 2013. "Predicting author h-index using characteristics of the co-author network," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 467-483, August.
    15. Fedderke, J.W. & Goldschmidt, M., 2015. "Does massive funding support of researchers work?: Evaluating the impact of the South African research chair funding initiative," Research Policy, Elsevier, vol. 44(2), pages 467-482.
    16. Giovanni Abramo & Ciriaco Andrea D'Angelo, 2015. "The VQR, Italy's second national research assessment: Methodological failures and ranking distortions," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(11), pages 2202-2214, November.
    17. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    18. Hyeonchae Yang & Woo-Sung Jung, 2015. "A strategic management approach for Korean public research institutes based on bibliometric investigation," Quality & Quantity: International Journal of Methodology, Springer, vol. 49(4), pages 1437-1464, July.
    19. Franceschini, Fiorenzo & Maisano, Domenico, 2011. "Structured evaluation of the scientific output of academic research groups by recent h-based indicators," Journal of Informetrics, Elsevier, vol. 5(1), pages 64-74.
    20. Serge Galam, 2011. "Tailor based allocations for multiple authorship: a fractional gh-index," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(1), pages 365-379, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:96:y:2013:i:2:d:10.1007_s11192-013-0969-9. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.