Printed from https://ideas.repec.org/a/eee/infome/v9y2015i3p529-541.html

The influence of time and discipline on the magnitude of correlations between citation counts and quality scores

Author

Listed:
  • Thelwall, Mike
  • Fairclough, Ruth

Abstract

Although various citation-based indicators are commonly used to support research evaluations, there are ongoing controversies about their value. In response, they are often correlated with quality ratings or with other quantitative indicators in order to partly assess their validity. When correlations are calculated for sets of publications spanning multiple disciplines or years, however, the magnitude of the correlation coefficient may be reduced, masking the strength of the underlying correlation. This article uses simulations to systematically investigate the extent to which mixing years or disciplines reduces correlations. The results show that mixing two sets of articles with different correlation strengths can reduce the correlation for the combined set to substantially below the average of the two. Moreover, even mixing two sets of articles with the same correlation strength but different mean citation counts can substantially reduce the correlation for the combined set. The extent of the reduction also depends upon whether the articles assessed have been pre-selected for being high quality and whether the relationship between the quality ratings and citation counts is linear or exponential. The results underline the importance of using homogeneous data sets, but also help with interpreting correlation coefficients when homogeneity is impossible.
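The mean-shift effect described in the abstract can be sketched with a toy simulation (this is not the authors' actual code; NumPy, the sample size, and the parameter choices are illustrative assumptions). Two simulated disciplines share the same quality distribution and the same within-group correlation between quality and citations, but one discipline is cited more on average; pooling them leaves the covariance with quality unchanged while inflating the citation variance, which drags the pooled correlation well below either within-group value.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # articles per discipline

# Same quality distribution and same within-group correlation in both
# disciplines; only the mean citation count differs between them.
quality = rng.normal(0.0, 1.0, size=2 * n)
noise = rng.normal(0.0, 1.0, size=2 * n)
citations = quality + noise   # within each group, r = 1/sqrt(2) ~ 0.71
citations[n:] += 6.0          # discipline B is cited more on average

r_a = np.corrcoef(quality[:n], citations[:n])[0, 1]
r_b = np.corrcoef(quality[n:], citations[n:])[0, 1]
r_pooled = np.corrcoef(quality, citations)[0, 1]

print(f"discipline A: r = {r_a:.2f}")       # ~ 0.71
print(f"discipline B: r = {r_b:.2f}")       # ~ 0.71
print(f"pooled:       r = {r_pooled:.2f}")  # ~ 0.30, well below either
```

Analytically, the shift of 6 adds a between-group variance of 6²/4 = 9 to the pooled citation variance while the covariance with quality stays at 1, so the pooled correlation falls to roughly 1/√11 ≈ 0.30; the toy numbers match the article's qualitative conclusion that heterogeneous sets understate the underlying correlation.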

Suggested Citation

  • Thelwall, Mike & Fairclough, Ruth, 2015. "The influence of time and discipline on the magnitude of correlations between citation counts and quality scores," Journal of Informetrics, Elsevier, vol. 9(3), pages 529-541.
  • Handle: RePEc:eee:infome:v:9:y:2015:i:3:p:529-541
    DOI: 10.1016/j.joi.2015.05.006

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S175115771520017X
    Download Restriction: Full text for ScienceDirect subscribers only

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ludo Waltman & Rodrigo Costas, 2014. "F1000 Recommendations as a Potential New Data Source for Research Evaluation: A Comparison With Citations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(3), pages 433-445, March.
    2. Nederhof, A. J. & van Raan, A. F. J., 1993. "A bibliometric analysis of six economics research groups: A comparison with peer review," Research Policy, Elsevier, vol. 22(4), pages 353-368, August.
    3. Rinia, E. J. & van Leeuwen, Th. N. & van Vuren, H. G. & van Raan, A. F. J., 1998. "Comparative analysis of a set of bibliometric indicators and central peer review criteria: Evaluation of condensed matter physics in the Netherlands," Research Policy, Elsevier, vol. 27(1), pages 95-107, May.
    4. Bornmann, Lutz & Leydesdorff, Loet, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.
    5. Didegah, Fereshteh & Thelwall, Mike, 2013. "Which factors help authors produce the highest impact research? Collaboration, journal and document properties," Journal of Informetrics, Elsevier, vol. 7(4), pages 861-873.
    6. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
    7. Burrell, Quentin L., 2007. "Hirsch's h-index: A stochastic model," Journal of Informetrics, Elsevier, vol. 1(1), pages 16-25.
    8. Emanuela Reale & Anna Barbara & Antonio Costantini, 2007. "Peer review for the evaluation of academic research: lessons from the Italian experience," Research Evaluation, Oxford University Press, vol. 16(3), pages 216-228, September.
    9. Thelwall, Mike & Wilson, Paul, 2014. "Distributions for cited articles from individual subjects and years," Journal of Informetrics, Elsevier, vol. 8(4), pages 824-839.
    Full references (including those not matched with items on IDEAS)

    Citations

Citations are extracted by the CitEc Project.

    Cited by:

    1. Yifan Qian & Wenge Rong & Nan Jiang & Jie Tang & Zhang Xiong, 2017. "Citation regression analysis of computer science publications in different ranking categories and subfields," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1351-1374, March.
    2. Hu, Zhigang & Tian, Wencan & Xu, Shenmeng & Zhang, Chunbo & Wang, Xianwen, 2018. "Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers," Journal of Informetrics, Elsevier, vol. 12(4), pages 1133-1145.
    3. Mike Thelwall, 2016. "Interpreting correlations between citation counts and other indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(1), pages 337-347, July.
    4. Mojisola Erdt & Aarthy Nagarajan & Sei-Ching Joanna Sin & Yin-Leng Theng, 2016. "Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1117-1166, November.
    5. Zoller, Daniel & Doerfel, Stephan & Jäschke, Robert & Stumme, Gerd & Hotho, Andreas, 2016. "Posted, visited, exported: Altmetrics in the social tagging system BibSonomy," Journal of Informetrics, Elsevier, vol. 10(3), pages 732-749.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    2. Jacques Wainer & Michael Eckmann & Anderson Rocha, 2015. "Peer-Selected “Best Papers”—Are They Really That “Good”?," PLOS ONE, Public Library of Science, vol. 10(3), pages 1-12, March.
    3. Mike Thelwall, 2016. "Interpreting correlations between citation counts and other indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(1), pages 337-347, July.
    4. Jacques Wainer & Paula Vieira, 2013. "Correlations between bibliometrics and peer evaluation for all disciplines: the evaluation of Brazilian scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 395-410, August.
    5. Ehsan Mohammadi & Mike Thelwall, 2013. "Assessing non-standard article impact using F1000 labels," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 383-395, November.
    6. Thelwall, Mike & Wilson, Paul, 2014. "Regression for citation data: An evaluation of different methods," Journal of Informetrics, Elsevier, vol. 8(4), pages 963-971.
    7. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    8. Rebora, Gianfranco & Turri, Matteo, 2013. "The UK and Italian research assessment exercises face to face," Research Policy, Elsevier, vol. 42(9), pages 1657-1666.
    9. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    10. Wan Jing Low & Paul Wilson & Mike Thelwall, 2016. "Stopped sum models and proposed variants for citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(2), pages 369-384, May.
    11. Robin Haunschild & Lutz Bornmann, 2018. "Field- and time-normalization of data with many zeros: an empirical analysis using citation and Twitter data," Scientometrics, Springer;Akadémiai Kiadó, vol. 116(2), pages 997-1012, August.
    12. Mike Thelwall & Paul Wilson, 2016. "Does research with statistics have more impact? The citation rank advantage of structural equation modeling," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(5), pages 1233-1244, May.
    13. Abramo, Giovanni & Cicero, Tindaro & D’Angelo, Ciriaco Andrea, 2011. "Assessing the varying level of impact measurement accuracy as a function of the citation window length," Journal of Informetrics, Elsevier, vol. 5(4), pages 659-667.
    14. Hu, Zhigang & Tian, Wencan & Xu, Shenmeng & Zhang, Chunbo & Wang, Xianwen, 2018. "Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers," Journal of Informetrics, Elsevier, vol. 12(4), pages 1133-1145.
    15. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    16. Giovanni Abramo & Ciriaco Andrea D’Angelo & Flavia Di Costa, 2011. "National research assessment exercises: a comparison of peer review and bibliometrics rankings," Scientometrics, Springer;Akadémiai Kiadó, vol. 89(3), pages 929-941, December.
    17. Thelwall, Mike & Fairclough, Ruth, 2015. "Geometric journal impact factors correcting for individual highly cited articles," Journal of Informetrics, Elsevier, vol. 9(2), pages 263-272.
    18. Mojisola Erdt & Aarthy Nagarajan & Sei-Ching Joanna Sin & Yin-Leng Theng, 2016. "Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1117-1166, November.
    19. Zoller, Daniel & Doerfel, Stephan & Jäschke, Robert & Stumme, Gerd & Hotho, Andreas, 2016. "Posted, visited, exported: Altmetrics in the social tagging system BibSonomy," Journal of Informetrics, Elsevier, vol. 10(3), pages 732-749.
    20. Abramo, Giovanni, 2018. "Revisiting the scientometric conceptualization of impact and its measurement," Journal of Informetrics, Elsevier, vol. 12(3), pages 590-597.

    Corrections

All material on this site has been provided by the respective publishers and authors. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:9:y:2015:i:3:p:529-541.


IDEAS is a RePEc service hosted by the Research Division of the Federal Reserve Bank of St. Louis. RePEc uses bibliographic data supplied by the respective publishers.