
Comparing scientific performance among equals

Author

Listed:
  • C. O. S. Sorzano

    (National Center of Biotechnology (CSIC); University CEU San Pablo)

  • J. Vargas

    (National Center of Biotechnology (CSIC))

  • G. Caffarena-Fernández

    (University CEU San Pablo)

  • A. Iriarte

    (University CEU San Pablo)

Abstract

Measuring scientific performance is now common practice among funding agencies, fellowship committees and hiring institutions. However, as many authors have recognized, comparing performance across scientific fields is difficult because each field has its own publication and citation patterns. In this article, we argue that the scientific performance of an individual scientist, laboratory or institution should be analysed within its corresponding context, and we provide objective tools for this kind of comparative analysis. The use of the new tools is illustrated with two control groups, to which several performance measurements are referred: one comprising the Physics and Chemistry Nobel laureates from 2007 to 2012, the other a list of outstanding scientists affiliated with two different institutions.
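
A minimal sketch of the control-group idea described in the abstract is given below: rather than judging a raw indicator in isolation, it is expressed as a percentile within a reference group of comparable scientists. The h-index is used here only as a convenient placeholder indicator, and all numbers are hypothetical; they are not the indicators, control-group data or results used in the paper.

    # Sketch only: express a researcher's indicator as a percentile within
    # a control group of peers, instead of reading the raw value in isolation.
    # The h-index values below are hypothetical, not taken from the paper.
    from bisect import bisect_right

    def percentile_within_group(value: float, control_group: list[float]) -> float:
        """Fraction of the control group whose indicator is <= value."""
        ranked = sorted(control_group)
        return bisect_right(ranked, value) / len(ranked)

    # Hypothetical h-indices of a control group of outstanding peers
    control_h = [35, 42, 48, 51, 57, 63, 70, 88]
    candidate_h = 50

    pct = 100 * percentile_within_group(candidate_h, control_h)
    print(f"Candidate sits at the {pct:.0f}th percentile of this control group")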

Suggested Citation

  • C. O. S. Sorzano & J. Vargas & G. Caffarena-Fernández & A. Iriarte, 2014. "Comparing scientific performance among equals," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(3), pages 1731-1745, December.
  • Handle: RePEc:spr:scient:v:101:y:2014:i:3:d:10.1007_s11192-014-1368-6
    DOI: 10.1007/s11192-014-1368-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-014-1368-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-014-1368-6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Igor Podlubny, 2005. "Comparison of scientific impact expressed by the number of citations in different fields of science," Scientometrics, Springer;Akadémiai Kiadó, vol. 64(1), pages 95-99, July.
    2. Lutz Bornmann & Rüdiger Mutz & Hans‐Dieter Daniel, 2008. "Are there better indices for evaluation purposes than the h index? A comparison of nine different variants of the h index using data from biomedicine," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 59(5), pages 830-837, March.
    3. John Panaretos & Chrisovaladis Malesios, 2009. "Assessing scientific research performance and impact with single indices," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 635-670, December.
    4. Juan Imperial & Alonso Rodríguez-Navarro, 2007. "Usefulness of Hirsch’s h-index to evaluate scientific research in Spain," Scientometrics, Springer;Akadémiai Kiadó, vol. 71(2), pages 271-282, May.
    5. Juan E. Iglesias & Carlos Pecharromán, 2007. "Scaling the h-index for different scientific ISI fields," Scientometrics, Springer;Akadémiai Kiadó, vol. 73(3), pages 303-320, December.
    6. A. M. Geoffrion & J. S. Dyer & A. Feinberg, 1972. "An Interactive Approach for Multi-Criterion Optimization, with an Application to the Operation of an Academic Department," Management Science, INFORMS, vol. 19(4-Part-1), pages 357-368, December.
    7. Lundberg, Jonas, 2007. "Lifting the crown—citation z-score," Journal of Informetrics, Elsevier, vol. 1(2), pages 145-154.
    8. Sune Lehmann & Andrew D. Jackson & Benny E. Lautrup, 2008. "A quantitative analysis of indicators of scientific performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 76(2), pages 369-390, August.
    9. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Zoltán Néda & Levente Varga & Tamás S Biró, 2017. "Science and Facebook: The same popularity law!," PLOS ONE, Public Library of Science, vol. 12(7), pages 1-11, July.
    2. Yanan Wang & An Zeng & Ying Fan & Zengru Di, 2019. "Ranking scientific publications considering the aging characteristics of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(1), pages 155-166, July.
    3. Giovanni Abramo & Ciriaco Andrea D’Angelo, 2021. "A bibliometric methodology to unveil territorial inequities in the scientific wealth to combat COVID-19," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6601-6624, August.
    4. Robert J Ellis & Zhiyan Duan & Ye Wang, 2014. "Quantifying Auditory Temporal Stability in a Large Database of Recorded Music," PLOS ONE, Public Library of Science, vol. 9(12), pages 1-24, December.
    5. Justus Haucap & Johannes Muck, 2015. "What drives the relevance and reputation of economics journals? An update from a survey among economists," Scientometrics, Springer;Akadémiai Kiadó, vol. 103(3), pages 849-877, June.
    6. Jianlin Zhou & An Zeng & Ying Fan & Zengru Di, 2016. "Ranking scientific publications with similarity-preferential mechanism," Scientometrics, Springer;Akadémiai Kiadó, vol. 106(2), pages 805-816, February.
    7. Abramo, Giovanni & Aksnes, Dag W. & D’Angelo, Ciriaco Andrea, 2021. "Gender differences in research performance within and between countries: Italy vs Norway," Journal of Informetrics, Elsevier, vol. 15(2).
    8. Lee Stapleton, 2015. "Do academics doubt their own research?," SPRU Working Paper Series 2015-24, SPRU - Science Policy Research Unit, University of Sussex Business School.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Kakushadze, Zura, 2016. "An index for SSRN downloads," Journal of Informetrics, Elsevier, vol. 10(1), pages 9-28.
    2. John Panaretos & Chrisovaladis Malesios, 2009. "Assessing scientific research performance and impact with single indices," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(3), pages 635-670, December.
    3. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    4. Lorna Wildgaard & Jesper W. Schneider & Birger Larsen, 2014. "A review of the characteristics of 108 author-level bibliometric indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 101(1), pages 125-158, October.
    5. Sangwal, Keshra, 2014. "Distributions of citations of papers of individual authors publishing in different scientific disciplines: Application of Langmuir-type function," Journal of Informetrics, Elsevier, vol. 8(4), pages 972-984.
    6. Deming Lin & Tianhui Gong & Wenbin Liu & Martin Meyer, 2020. "An entropy-based measure for the evolution of h index research," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(3), pages 2283-2298, December.
    7. Lutz Bornmann & Werner Marx, 2014. "How to evaluate individual researchers working in the natural and life sciences meaningfully? A proposal of methods based on percentiles of citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 98(1), pages 487-509, January.
    8. Edgar D. Zanotto & Vinicius Carvalho, 2021. "Article age- and field-normalized tools to evaluate scientific impact and momentum," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 2865-2883, April.
    9. Giovanni Anania & Annarosa Caruso, 2013. "Two simple new bibliometric indexes to better evaluate research in disciplines where publications typically receive less citations," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 617-631, August.
    10. Maziar Montazerian & Edgar Dutra Zanotto & Hellmut Eckert, 2020. "Prolificacy and visibility versus reputation in the hard sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 123(1), pages 207-221, April.
    11. Rok Blagus & Brane L. Leskošek & Janez Stare, 2015. "Comparison of bibliometric measures for assessing relative importance of researchers," Scientometrics, Springer;Akadémiai Kiadó, vol. 105(3), pages 1743-1762, December.
    12. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    13. Bornmann, Lutz & Ganser, Christian & Tekles, Alexander, 2022. "Simulation of the h index use at university departments within the bibliometrics-based heuristics framework: Can the indicator be used to compare individual researchers?," Journal of Informetrics, Elsevier, vol. 16(1).
    14. Brandão, Luana Carneiro & Soares de Mello, João Carlos Correia Baptista, 2019. "A multi-criteria approach to the h-index," European Journal of Operational Research, Elsevier, vol. 276(1), pages 357-363.
    15. Qurat-ul Ain & Hira Riaz & Muhammad Tanvir Afzal, 2019. "Evaluation of h-index and its citation intensity based variants in the field of mathematics," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(1), pages 187-211, April.
    16. Anania, Giovanni & Caruso, Annarosa, 2012. "Two New Simple Bibliometric Indexes to Better Evaluate Research in Economics," 2012 First Congress, June 4-5, 2012, Trento, Italy 124116, Italian Association of Agricultural and Applied Economics (AIEAA).
    17. Hyeonchae Yang & Woo-Sung Jung, 2015. "A strategic management approach for Korean public research institutes based on bibliometric investigation," Quality & Quantity: International Journal of Methodology, Springer, vol. 49(4), pages 1437-1464, July.
    18. Bornmann, Lutz & Mutz, Rüdiger & Daniel, Hans-Dieter, 2010. "The h index research output measurement: Two approaches to enhance its accuracy," Journal of Informetrics, Elsevier, vol. 4(3), pages 407-414.
    19. Maziar Montazerian & Edgar Dutra Zanotto & Hellmut Eckert, 2019. "A new parameter for (normalized) evaluation of H-index: countries as a case study," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(3), pages 1065-1078, March.
    20. William Cabos & Juan Miguel Campanario, 2018. "Exploring the Hjif-Index, an Analogue to the H-Like Index for Journal Impact Factors," Publications, MDPI, vol. 6(2), pages 1-11, April.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:101:y:2014:i:3:d:10.1007_s11192-014-1368-6. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.