Printed from https://ideas.repec.org/a/eee/infome/v9y2015i3p529-541.html

The influence of time and discipline on the magnitude of correlations between citation counts and quality scores

Author

Listed:
  • Thelwall, Mike
  • Fairclough, Ruth

Abstract

Although various citation-based indicators are commonly used to help research evaluations, there are ongoing controversies about their value. In response, they are often correlated with quality ratings or with other quantitative indicators in order to partly assess their validity. When correlations are calculated for sets of publications from multiple disciplines or years, however, the magnitude of the correlation coefficient may be reduced, masking the strength of the underlying correlation. This article uses simulations to systematically investigate the extent to which mixing years or disciplines reduces correlations. The results show that mixing two sets of articles with different correlation strengths can reduce the correlation for the combined set to substantially below the average of the two. Moreover, even mixing two sets of articles with the same correlation strength but different mean citation counts can substantially reduce the correlation for the combined set. The extent of the reduction in correlation also depends upon whether the articles assessed have been pre-selected for being high quality and whether the relationship between the quality ratings and citation counts is linear or exponential. The results underline the importance of using homogeneous data sets but also help to interpret correlation coefficients when this is impossible.
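The pooling effect described in the abstract can be illustrated with a small simulation (a sketch of the general idea, not the authors' actual code or parameters): two simulated disciplines share the same within-discipline correlation between quality and citations, but one has a much higher mean citation count. Pooling them inflates the citation variance without adding quality-related covariance, so the combined correlation falls well below either within-discipline value.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # articles per simulated discipline

# Quality scores: same distribution in both disciplines.
q_a = rng.normal(0, 1, n)
q_b = rng.normal(0, 1, n)

# Citations: identical within-discipline relationship to quality,
# but discipline B has a much higher mean citation count (+5).
c_a = q_a + rng.normal(0, 1, n)
c_b = q_b + rng.normal(0, 1, n) + 5.0

# Within-discipline correlations are each about 1/sqrt(2) ~ 0.71.
r_a = np.corrcoef(q_a, c_a)[0, 1]
r_b = np.corrcoef(q_b, c_b)[0, 1]

# Pooled correlation: the between-discipline offset inflates the
# citation variance, dragging the coefficient far below both.
r_pooled = np.corrcoef(np.concatenate([q_a, q_b]),
                       np.concatenate([c_a, c_b]))[0, 1]

print(f"discipline A: r = {r_a:.2f}")       # ~ 0.71
print(f"discipline B: r = {r_b:.2f}")       # ~ 0.71
print(f"pooled set:   r = {r_pooled:.2f}")  # ~ 0.35
```

Here the drop is predictable analytically: pooling leaves the quality-citation covariance unchanged but adds the between-group mean difference to the citation variance, so the pooled coefficient shrinks even though neither underlying relationship has weakened.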

Suggested Citation

  • Thelwall, Mike & Fairclough, Ruth, 2015. "The influence of time and discipline on the magnitude of correlations between citation counts and quality scores," Journal of Informetrics, Elsevier, vol. 9(3), pages 529-541.
  • Handle: RePEc:eee:infome:v:9:y:2015:i:3:p:529-541
    DOI: 10.1016/j.joi.2015.05.006

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S175115771520017X
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2015.05.006?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Nederhof, A. J. & van Raan, A. F. J., 1993. "A bibliometric analysis of six economics research groups: A comparison with peer review," Research Policy, Elsevier, vol. 22(4), pages 353-368, August.
    2. Bornmann, Lutz & Leydesdorff, Loet, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.
    3. Didegah, Fereshteh & Thelwall, Mike, 2013. "Which factors help authors produce the highest impact research? Collaboration, journal and document properties," Journal of Informetrics, Elsevier, vol. 7(4), pages 861-873.
    4. Burrell, Quentin L., 2007. "Hirsch's h-index: A stochastic model," Journal of Informetrics, Elsevier, vol. 1(1), pages 16-25.
    5. Thelwall, Mike & Wilson, Paul, 2014. "Distributions for cited articles from individual subjects and years," Journal of Informetrics, Elsevier, vol. 8(4), pages 824-839.
    6. Ludo Waltman & Rodrigo Costas, 2014. "F1000 Recommendations as a Potential New Data Source for Research Evaluation: A Comparison With Citations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(3), pages 433-445, March.
    7. Rinia, E. J. & van Leeuwen, Th. N. & van Vuren, H. G. & van Raan, A. F. J., 1998. "Comparative analysis of a set of bibliometric indicators and central peer review criteria: Evaluation of condensed matter physics in the Netherlands," Research Policy, Elsevier, vol. 27(1), pages 95-107, May.
    8. Ehsan Mohammadi & Mike Thelwall, 2013. "Assessing non-standard article impact using F1000 labels," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 383-395, November.
    9. William H. Starbuck, 2005. "How Much Better Are the Most-Prestigious Journals? The Statistics of Academic Publication," Organization Science, INFORMS, vol. 16(2), pages 180-200, April.
    10. Derek De Solla Price, 1976. "A general theory of bibliometric and other cumulative advantage processes," Journal of the American Society for Information Science, Association for Information Science & Technology, vol. 27(5), pages 292-306, September.
    11. Werner Marx & Lutz Bornmann, 2015. "On the causes of subject-specific citation rates in Web of Science," Scientometrics, Springer;Akadémiai Kiadó, vol. 102(2), pages 1823-1827, February.
    12. Franceschet, Massimo & Costantini, Antonio, 2011. "The first Italian research assessment exercise: A bibliometric perspective," Journal of Informetrics, Elsevier, vol. 5(2), pages 275-291.
    13. Emanuela Reale & Anna Barbara & Antonio Costantini, 2007. "Peer review for the evaluation of academic research: lessons from the Italian experience," Research Evaluation, Oxford University Press, vol. 16(3), pages 216-228, September.
    14. Giovanni Abramo & Tindaro Cicero & Ciriaco Andrea D’Angelo, 2013. "National peer-review research assessment exercises for the hard sciences can be a complete waste of money: the Italian case," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(1), pages 311-324, April.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc project. Subscribe to its RSS feed for this item.


    Cited by:

    1. Yu, Dejian & Pan, Tianxing, 2021. "Tracing the main path of interdisciplinary research considering citation preference: A case from blockchain domain," Journal of Informetrics, Elsevier, vol. 15(2).
    2. Mojisola Erdt & Aarthy Nagarajan & Sei-Ching Joanna Sin & Yin-Leng Theng, 2016. "Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1117-1166, November.
    3. Zoller, Daniel & Doerfel, Stephan & Jäschke, Robert & Stumme, Gerd & Hotho, Andreas, 2016. "Posted, visited, exported: Altmetrics in the social tagging system BibSonomy," Journal of Informetrics, Elsevier, vol. 10(3), pages 732-749.
    4. Yifan Qian & Wenge Rong & Nan Jiang & Jie Tang & Zhang Xiong, 2017. "Citation regression analysis of computer science publications in different ranking categories and subfields," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(3), pages 1351-1374, March.
    5. Hu, Zhigang & Tian, Wencan & Xu, Shenmeng & Zhang, Chunbo & Wang, Xianwen, 2018. "Four pitfalls in normalizing citation indicators: An investigation of ESI’s selection of highly cited papers," Journal of Informetrics, Elsevier, vol. 12(4), pages 1133-1145.
    6. Mike Thelwall, 2016. "Interpreting correlations between citation counts and other indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(1), pages 337-347, July.
    7. Ashraf Maleki, 2022. "OCLC library holdings: assessing availability of academic books in libraries in print and electronic compared to citations and altmetrics," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(2), pages 991-1020, February.
    8. Xie, Qing & Zhang, Xinyuan & Song, Min, 2021. "A network embedding-based scholar assessment indicator considering four facets: Research topic, author credit allocation, field-normalized journal impact, and published time," Journal of Informetrics, Elsevier, vol. 15(4).
    9. Ashraf Maleki, 2022. "Why does library holding format really matter for book impact assessment?: Modelling the relationship between citations and altmetrics with print and electronic holdings," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(2), pages 1129-1160, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Giovanni Abramo & Ciriaco Andrea D’Angelo & Emanuela Reale, 2019. "Peer review versus bibliometrics: Which method better predicts the scholarly impact of publications?," Scientometrics, Springer;Akadémiai Kiadó, vol. 121(1), pages 537-554, October.
    2. Thelwall, Mike & Wilson, Paul, 2014. "Regression for citation data: An evaluation of different methods," Journal of Informetrics, Elsevier, vol. 8(4), pages 963-971.
    3. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    4. Jacques Wainer & Michael Eckmann & Anderson Rocha, 2015. "Peer-Selected “Best Papers”—Are They Really That “Good”?," PLOS ONE, Public Library of Science, vol. 10(3), pages 1-12, March.
    5. Chi, Yuxue & Tang, Xianyi & Liu, Yijun, 2022. "Exploring the “awakening effect” in knowledge diffusion: a case study of publications in the library and information science domain," Journal of Informetrics, Elsevier, vol. 16(4).
    6. Wan Jing Low & Paul Wilson & Mike Thelwall, 2016. "Stopped sum models and proposed variants for citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(2), pages 369-384, May.
    7. Mike Thelwall, 2016. "Interpreting correlations between citation counts and other indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 108(1), pages 337-347, July.
    8. Jacques Wainer & Paula Vieira, 2013. "Correlations between bibliometrics and peer evaluation for all disciplines: the evaluation of Brazilian scientists," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 395-410, August.
    9. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    10. Ehsan Mohammadi & Mike Thelwall, 2013. "Assessing non-standard article impact using F1000 labels," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 383-395, November.
    11. Mojisola Erdt & Aarthy Nagarajan & Sei-Ching Joanna Sin & Yin-Leng Theng, 2016. "Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1117-1166, November.
    12. Bornmann, Lutz & Marx, Werner, 2015. "Methods for the generation of normalized citation impact scores in bibliometrics: Which method best reflects the judgements of experts?," Journal of Informetrics, Elsevier, vol. 9(2), pages 408-418.
    13. Thelwall, Mike, 2016. "The precision of the arithmetic mean, geometric mean and percentiles for citation data: An experimental simulation modelling approach," Journal of Informetrics, Elsevier, vol. 10(1), pages 110-123.
    14. Fairclough, Ruth & Thelwall, Mike, 2015. "More precise methods for national research citation impact comparisons," Journal of Informetrics, Elsevier, vol. 9(4), pages 895-906.
    15. Vieira, Elizabeth S. & Cabral, José A.S. & Gomes, José A.N.F., 2014. "How good is a model based on bibliometric indicators in predicting the final decisions made by peers?," Journal of Informetrics, Elsevier, vol. 8(2), pages 390-405.
    16. Mingyang Wang & Shi Li & Guangsheng Chen, 2017. "Detecting latent referential articles based on their vitality performance in the latest 2 years," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1557-1571, September.
    17. Thelwall, Mike, 2016. "The discretised lognormal and hooked power law distributions for complete citation data: Best options for modelling and regression," Journal of Informetrics, Elsevier, vol. 10(2), pages 336-346.
    18. Peiling Wang & Joshua Williams & Nan Zhang & Qiang Wu, 2020. "F1000Prime recommended articles and their citations: an exploratory study of four journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 933-955, February.
    19. Jianhua Hou & Xiucai Yang & Yang Zhang, 2023. "The effect of social media knowledge cascade: an analysis of scientific papers diffusion," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(9), pages 5169-5195, September.
    20. Franceschet, Massimo & Costantini, Antonio, 2010. "The effect of scholar collaboration on impact and quality of academic papers," Journal of Informetrics, Elsevier, vol. 4(4), pages 540-553.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:9:y:2015:i:3:p:529-541. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.