
Calibrating the zoom — a test of Zitt’s hypothesis

Authors

  • Jonathan Adams (Evidence Ltd.)
  • Karen Gurney (Evidence Ltd.)
  • Louise Jackson (Evidence Ltd.)

Abstract

Bibliometric indicators are widely used to compare performance between units operating in different fields of science. For cross-field comparisons, article citation rates have to be normalised to baseline values because citation practices vary between fields, in respect of timing and volume. Baseline citation values vary according to the level at which articles are aggregated (journal, sub-field, field). Consequently, the normalised citation performance of each research unit will depend on the level of aggregation, or ‘zoom’, that was used when the baselines were calculated. Here, we calculate the citation performance of UK research units for each of three levels of article-aggregation. We then compare this with the grade awarded to that unit by external peer review. We find that the correlation between average normalised citation impact and peer-reviewed grade does indeed vary according to the selected level of zoom. The possibility that the level of ‘zoom’ will affect our assessment of relative impact is an important insight. The fact that more than one view, and hence more than one interpretation of performance, might exist would need to be taken into account in any evaluation methodology. This is likely to be a serious challenge unless a reference indicator is available, and it will generally require any evaluation to be carried out at multiple levels to support a reflective review.
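To make the calculation concrete, here is a minimal Python sketch of the comparison described above. It is not the authors' code: the article records, unit names, peer grades, and the simple Spearman routine are invented assumptions for illustration. Each article's citation count is divided by the mean citation rate of its journal, sub-field, or field (the three zoom levels), impacts are averaged per unit, and the unit averages are rank-correlated with the peer-review grades.

from collections import defaultdict
from statistics import mean

# Toy records (assumption): (unit, journal, sub-field, field, citations)
articles = [
    ("U1", "Journal A", "Sub-field S1", "Field F1", 12),
    ("U1", "Journal B", "Sub-field S2", "Field F1", 4),
    ("U2", "Journal A", "Sub-field S1", "Field F1", 6),
    ("U2", "Journal C", "Sub-field S3", "Field F2", 30),
    ("U3", "Journal C", "Sub-field S3", "Field F2", 10),
]
peer_grade = {"U1": 4, "U2": 5, "U3": 3}  # hypothetical peer-review grades

def baselines(level):
    """Mean citations per article for each category at the chosen zoom level
    (level 0 = journal, 1 = sub-field, 2 = field)."""
    pools = defaultdict(list)
    for _, j, s, f, c in articles:
        pools[(j, s, f)[level]].append(c)
    return {k: mean(v) for k, v in pools.items()}

def unit_impacts(level):
    """Average normalised citation impact per unit at one zoom level."""
    base = baselines(level)
    per_unit = defaultdict(list)
    for u, j, s, f, c in articles:
        per_unit[u].append(c / base[(j, s, f)[level]])
    return {u: mean(v) for u, v in per_unit.items()}

def spearman(x, y):
    """Spearman rank correlation (no tie handling; adequate for this sketch)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

for level, name in enumerate(["journal", "sub-field", "field"]):
    impact = unit_impacts(level)
    units = sorted(impact)
    rho = spearman([impact[u] for u in units], [peer_grade[u] for u in units])
    print(name, {u: round(impact[u], 2) for u in units}, "rho =", round(rho, 2))

Run at each zoom level in turn, even this toy data shows the point at issue: the unit rankings, and their correlation with the peer grades, shift as the baseline moves from journal to field.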

Suggested Citation

  • Jonathan Adams & Karen Gurney & Louise Jackson, 2008. "Calibrating the zoom — a test of Zitt’s hypothesis," Scientometrics, Springer;Akadémiai Kiadó, vol. 75(1), pages 81-95, April.
  • Handle: RePEc:spr:scient:v:75:y:2008:i:1:d:10.1007_s11192-007-1832-7
    DOI: 10.1007/s11192-007-1832-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-007-1832-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-007-1832-7?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Igor Podlubny, 2005. "Comparison of scientific impact expressed by the number of citations in different fields of science," Scientometrics, Springer;Akadémiai Kiadó, vol. 64(1), pages 95-99, July.
    2. Wolfgang Glänzel & Henk F. Moed, 2002. "Journal impact measures in bibliometric research," Scientometrics, Springer;Akadémiai Kiadó, vol. 53(2), pages 171-193, February.
    3. Ronald N. Kostoff, 2002. "Citation analysis of research performer quality," Scientometrics, Springer;Akadémiai Kiadó, vol. 53(1), pages 49-71, January.
    4. Jonathan Adams, 1998. "Benchmarking international research," Nature, Nature, vol. 396(6712), pages 615-618, December.
    5. Michel Zitt & Suzy Ramanana-Rahary & Elise Bassecoulard, 2005. "Relativity of citation performance and excellence measures: From cross-field to cross-scale effects of field-normalisation," Scientometrics, Springer;Akadémiai Kiadó, vol. 63(2), pages 373-401, April.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "Globalised vs averaged: Bias and ranking performance on the author level," Journal of Informetrics, Elsevier, vol. 13(1), pages 299-313.
    2. Ludo Waltman & Nees Jan Eck, 2013. "Source normalized indicators of citation impact: an overview of different approaches and an empirical comparison," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(3), pages 699-716, September.
    3. Adams, Jonathan, 2018. "Information and misinformation in bibliometric time-trend analysis," Journal of Informetrics, Elsevier, vol. 12(4), pages 1063-1071.
    4. V. A. Traag & L. Waltman, 2019. "Systematic analysis of agreement between metrics and peer review in the UK REF," Palgrave Communications, Palgrave Macmillan, vol. 5(1), pages 1-12, December.
    5. Dunaiski, Marcel & Geldenhuys, Jaco & Visser, Willem, 2019. "On the interplay between normalisation, bias, and performance of paper impact metrics," Journal of Informetrics, Elsevier, vol. 13(1), pages 270-290.
    6. Wolfgang Glänzel & Bart Thijs, 2018. "The role of baseline granularity for benchmarking citation impact. The case of CSS profiles," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 521-536, July.
    7. Wolfgang Glänzel & Bart Thijs & András Schubert & Koenraad Debackere, 2009. "Subfield-specific normalized relative indicators and a new generation of relational charts: Methodological foundations illustrated on the assessment of institutional research performance," Scientometrics, Springer;Akadémiai Kiadó, vol. 78(1), pages 165-188, January.
    8. T. S. Evans & N. Hopkins & B. S. Kaube, 2012. "Universality of performance indicators based on citation and reference counts," Scientometrics, Springer;Akadémiai Kiadó, vol. 93(2), pages 473-495, November.
    9. Ludo Waltman & Erjia Yan & Nees Jan Eck, 2011. "A recursive field-normalized bibliometric performance indicator: an application to the field of library and information science," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(1), pages 301-314, October.
    10. Theresa Velden & Asif-ul Haque & Carl Lagoze, 2010. "A new approach to analyzing patterns of collaboration in co-authorship networks: mesoscopic analysis and interpretation," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(1), pages 219-242, October.
    11. Vaccario, Giacomo & Medo, Matúš & Wider, Nicolas & Mariani, Manuel Sebastian, 2017. "Quantifying and suppressing ranking bias in a large citation network," Journal of Informetrics, Elsevier, vol. 11(3), pages 766-782.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one (a minimal sketch of this relatedness scoring follows the list).
    1. José María Gómez-Sancho & María Jesús Mancebón-Torrubia, 2009. "The evaluation of scientific production: Towards a neutral impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(2), pages 435-458, November.
    2. Bar-Ilan, Judit, 2008. "Informetrics at the beginning of the 21st century—A review," Journal of Informetrics, Elsevier, vol. 2(1), pages 1-52.
    3. Waltman, Ludo, 2016. "A review of the literature on citation impact indicators," Journal of Informetrics, Elsevier, vol. 10(2), pages 365-391.
    4. José María Gómez-Sancho & María Jesús Mancebón-Torrubia, 2010. "A new approach to measuring scientific production in JCR journals and its application to Spanish public universities," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(1), pages 271-293, October.
    5. Bárbara S. Lancho-Barrantes & Vicente P. Guerrero-Bote & Félix Moya-Anegón, 2010. "The iceberg hypothesis revisited," Scientometrics, Springer;Akadémiai Kiadó, vol. 85(2), pages 443-461, November.
    6. Beirlant, Jan & Glänzel, Wolfgang & Carbonez, An & Leemans, Herlinde, 2007. "Scoring research output using statistical quantile plotting," Journal of Informetrics, Elsevier, vol. 1(3), pages 185-192.
    7. M. Zitt, 2011. "Behind citing-side normalization of citations: some properties of the journal impact factor," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(1), pages 329-344, October.
    8. Zitt, Michel, 2010. "Citing-side normalization of journal impact: A robust variant of the Audience Factor," Journal of Informetrics, Elsevier, vol. 4(3), pages 392-406.
    9. Tolga Yuret, 2018. "Author-weighted impact factor and reference return ratio: can we attain more equality among fields?," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(3), pages 2097-2111, September.
    10. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    11. Lutz Bornmann & Klaus Wohlrabe, 2019. "Normalisation of citation impact in economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 841-884, August.
    12. Colliander, Cristian & Ahlgren, Per, 2011. "The effects and their stability of field normalization baseline on relative performance with respect to citation impact: A case study of 20 natural science departments," Journal of Informetrics, Elsevier, vol. 5(1), pages 101-113.
    13. Michel Zitt, 2012. "The journal impact factor: angel, devil, or scapegoat? A comment on J.K. Vanclay’s article 2011," Scientometrics, Springer;Akadémiai Kiadó, vol. 92(2), pages 485-503, August.
    14. Jonathan Adams & Karen Gurney & Stuart Marshall, 2007. "Profiling citation impact: A new methodology," Scientometrics, Springer;Akadémiai Kiadó, vol. 72(2), pages 325-344, August.
    15. Wolfgang Glänzel & Bart Thijs, 2018. "The role of baseline granularity for benchmarking citation impact. The case of CSS profiles," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(1), pages 521-536, July.
    16. Alonso, S. & Cabrerizo, F.J. & Herrera-Viedma, E. & Herrera, F., 2009. "h-Index: A review focused in its variants, computation and standardization for different scientific fields," Journal of Informetrics, Elsevier, vol. 3(4), pages 273-289.
    17. Stephen Carley & Alan L. Porter, 2012. "A forward diversity index," Scientometrics, Springer;Akadémiai Kiadó, vol. 90(2), pages 407-427, February.
    18. Lawson, Cornelia & Geuna, Aldo & Ana Fernández-Zubieta & Toselli, Manuel & Kataishi, Rodrigo, 2015. "International Careers of Researchers in Biomedical Sciences: A Comparison of the US and the UK," Department of Economics and Statistics Cognetti de Martiis. Working Papers 201514, University of Turin.
    19. Hyeonchae Yang & Woo-Sung Jung, 2015. "A strategic management approach for Korean public research institutes based on bibliometric investigation," Quality & Quantity: International Journal of Methodology, Springer, vol. 49(4), pages 1437-1464, July.
    20. Seongkyoon Jeong & Jong-Chan Kim & Jae Young Choi, 2015. "Technology convergence: What developmental stage are we in?," Scientometrics, Springer;Akadémiai Kiadó, vol. 104(3), pages 841-871, September.
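
For orientation, the following minimal Python sketch (invented data, not the actual RePEc/CitEc procedure) scores candidate items by the number of references they share with this item plus the number of later papers that cite both, and ranks them by that combined score.

# Toy citation data (assumption): paper -> set of papers it cites / is cited by.
references = {
    "this_item": {"A", "B", "C"},
    "item1": {"A", "B", "X"},
    "item2": {"C", "Y"},
    "item3": {"X", "Y"},
}
cited_by = {
    "this_item": {"q1", "q2"},
    "item1": {"q1"},
    "item2": {"q2", "q3"},
    "item3": {"q3"},
}

def relatedness(target, other):
    """Shared references (bibliographic coupling) plus shared citing papers (co-citation)."""
    shared_refs = len(references[target] & references[other])
    shared_citers = len(cited_by[target] & cited_by[other])
    return shared_refs + shared_citers

candidates = [p for p in references if p != "this_item"]
for p in sorted(candidates, key=lambda p: relatedness("this_item", p), reverse=True):
    print(p, relatedness("this_item", p))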


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:75:y:2008:i:1:d:10.1007_s11192-007-1832-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.