Printed from https://ideas.repec.org/a/gam/jpubli/v8y2020i2p34-d373852.html

An Empirical Test of the Inter-Relationships between Various Bibliometric Creative Scholarship Indicators

Author

Listed:
  • Boris Forthmann

    (Institute of Psychology in Education, University of Münster, 48149 Münster, Germany)

  • Mark A. Runco

    (Innovation and Leadership Faculty, Southern Oregon University, Medford, OR 97501, USA)

Abstract

Quantifying the creative quality of scholarly work is a difficult challenge, and, unsurprisingly, empirical research in this area is scarce. This investigation builds on the theoretical distinction between impact (e.g., citation counts) and creative quality (e.g., originality) and extends recent work on using objective measures to assess the originality of scientific publications. Following extensive evidence from creativity research and theoretical deliberations, we operationalized multiple indicators of openness and idea density for bibliometric research. Results showed that in two large bibliometric datasets (creativity research: N = 1643; bibliometrics dataset: N = 2986) correlations between impact and the various indicators for openness, idea density, and originality were negligible to small; this finding supports the discriminant validity of the new creative scholarship indicators. The convergent validity of these indicators was not as clear, but correlations were comparable to previous research on bibliometric originality. Next, we explored the nomological net of various operationalizations of openness and idea density by means of exploratory graph analysis. The openness indicators of variety (based on cited journals and cited first authors) were found to be made up of strongly connected nodes in a separate cluster; the idea density indicators (those based on abstracts or titles of scientific work) also formed a separate cluster. Based on these findings, we discuss the problems arising from the potential methodological overlap among indicators and we offer future directions for bibliometric explorations of the creative quality of scientific publications.
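The exploratory graph analysis (EGA) step described in the abstract treats each bibliometric indicator as a node in a network and reads off "dimensions" as communities of strongly connected nodes. A minimal stdlib-Python sketch of that idea follows. It is not the authors' implementation (EGA work in this literature typically uses the R EGAnet/qgraph ecosystem, with a graphical-lasso network and walktrap community detection); here the network is a simple thresholded correlation graph and communities are reduced to connected components, and all variable names and the simulated "openness"/"idea density" factors are illustrative assumptions.

```python
import math
import random

random.seed(42)

def pearson(x, y):
    """Plain Pearson correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

# Simulate 200 "papers" scored on six hypothetical indicators: three driven
# by a latent "openness" factor, three by a latent "idea density" factor.
n = 200
openness = [random.gauss(0, 1) for _ in range(n)]
density = [random.gauss(0, 1) for _ in range(n)]
indicators = {}
for i in range(3):
    indicators[f"open_{i}"] = [f + random.gauss(0, 0.5) for f in openness]
for i in range(3):
    indicators[f"dens_{i}"] = [f + random.gauss(0, 0.5) for f in density]

# Build the network: nodes are indicators, edges where |r| exceeds a cutoff
# (full EGA instead estimates a regularized partial-correlation network).
names = list(indicators)
threshold = 0.5
adj = {v: set() for v in names}
for i, a in enumerate(names):
    for b in names[i + 1:]:
        if abs(pearson(indicators[a], indicators[b])) > threshold:
            adj[a].add(b)
            adj[b].add(a)

def components(adj):
    """Connected components via depth-first search; stands in for the
    walktrap community detection used by EGA proper."""
    seen, comps = set(), []
    for start in adj:
        if start in seen:
            continue
        stack, comp = [start], set()
        while stack:
            v = stack.pop()
            if v in comp:
                continue
            comp.add(v)
            stack.extend(adj[v] - comp)
        seen |= comp
        comps.append(comp)
    return comps

# Each component is one recovered "dimension" of the indicator set.
clusters = components(adj)
print(sorted(sorted(c) for c in clusters))
```

With this setup the two latent factors are recovered as two separate clusters, mirroring the paper's finding that the openness-variety indicators and the idea-density indicators each form their own cluster.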

Suggested Citation

  • Boris Forthmann & Mark A. Runco, 2020. "An Empirical Test of the Inter-Relationships between Various Bibliometric Creative Scholarship Indicators," Publications, MDPI, vol. 8(2), pages 1-16, June.
  • Handle: RePEc:gam:jpubli:v:8:y:2020:i:2:p:34-:d:373852

    Download full text from publisher

    File URL: https://www.mdpi.com/2304-6775/8/2/34/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2304-6775/8/2/34/
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Aria, Massimo & Cuccurullo, Corrado, 2017. "bibliometrix: An R-tool for comprehensive science mapping analysis," Journal of Informetrics, Elsevier, vol. 11(4), pages 959-975.
    2. James Hartley, 2017. "Authors and their citations: a point of view," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(2), pages 1081-1084, February.
    3. van Buuren, Stef & Groothuis-Oudshoorn, Karin, 2011. "mice: Multivariate Imputation by Chained Equations in R," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 45(i03).
    4. Wang, Jian, 2016. "Knowledge creation in collaboration networks: Effects of tie configuration," Research Policy, Elsevier, vol. 45(1), pages 68-80.
    5. Brown, L.D. & Gardner, J.C., 1985. "Using citation analysis to assess the impact of journals and articles on contemporary accounting research (CAR)," Journal of Accounting Research, Wiley Blackwell, vol. 23(1), pages 84-109.
    6. Wolfgang Glänzel & Henk F. Moed, 2002. "Journal impact measures in bibliometric research," Scientometrics, Springer;Akadémiai Kiadó, vol. 53(2), pages 171-193, February.
    7. Andy Stirling, 2007. "A General Framework for Analysing Diversity in Science, Technology and Society," SPRU Working Paper Series 156, SPRU - Science Policy Research Unit, University of Sussex Business School.
    8. Adriano S. Melo & Luis Mauricio Bini & Priscilla Carvalho, 2006. "Brazilian articles in international journals on Limnology," Scientometrics, Springer;Akadémiai Kiadó, vol. 67(2), pages 187-199, May.
    9. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    10. Lin Zhang & Ronald Rousseau & Wolfgang Glänzel, 2016. "Diversity of references as an indicator of the interdisciplinarity of journals: Taking similarity between subject fields into account," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(5), pages 1257-1265, May.
    11. Hudson F Golino & Sacha Epskamp, 2017. "Exploratory graph analysis: A new approach for estimating the number of dimensions in psychological research," PLOS ONE, Public Library of Science, vol. 12(6), pages 1-26, June.
    12. Sanjay K. Arora & Alan L. Porter & Jan Youtie & Philip Shapira, 2013. "Capturing new developments in an emerging technology: an updated search strategy for identifying nanotechnology research outputs," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(1), pages 351-370, April.
    13. Franceschet, Massimo & Costantini, Antonio, 2010. "The effect of scholar collaboration on impact and quality of academic papers," Journal of Informetrics, Elsevier, vol. 4(4), pages 540-553.
    14. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    15. Golino, Hudson F. & Demetriou, Andreas, 2017. "Estimating the dimensionality of intelligence like data using Exploratory Graph Analysis," Intelligence, Elsevier, vol. 62(C), pages 54-70.
    16. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    17. Freese, Jeremy & Peterson, David, 2017. "Replication in Social Science," SocArXiv 5bck9, Center for Open Science.
    18. Thomas Heinze & Gerrit Bauer, 2007. "Characterizing creative scientists in nano-S&T: Productivity, multidisciplinarity, and network brokerage in a longitudinal perspective," Scientometrics, Springer;Akadémiai Kiadó, vol. 70(3), pages 811-830, March.
    19. Mortaza Jamshidian & Siavash Jalal, 2010. "Tests of Homoscedasticity, Normality, and Missing Completely at Random for Incomplete Multivariate Data," Psychometrika, Springer;The Psychometric Society, vol. 75(4), pages 649-674, December.
    20. Forthmann, Boris & Jendryczko, David & Scharfen, Jana & Kleinkorres, Ruben & Benedek, Mathias & Holling, Heinz, 2019. "Creative ideation, broad retrieval ability, and processing speed: A confirmatory study of nested cognitive abilities," Intelligence, Elsevier, vol. 75(C), pages 59-72.
    21. Thomas Heinze, 2013. "Creative accomplishments in science: definition, theoretical considerations, examples from science history, and bibliometric findings," Scientometrics, Springer;Akadémiai Kiadó, vol. 95(3), pages 927-940, June.
    22. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    23. András Schubert, 2010. "A reference-based Hirschian similarity measure for journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 84(1), pages 133-147, July.
    24. Alfredo Yegros-Yegros & Ismael Rafols & Pablo D’Este, 2015. "Does Interdisciplinary Research Lead to Higher Citation Impact? The Different Effect of Proximal and Distal Interdisciplinarity," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-21, August.
    25. Jesper W. Schneider, 2018. "NHST is still logically flawed," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 627-635, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    2. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    3. Shiji Chen & Yanhui Song & Fei Shu & Vincent Larivière, 2022. "Interdisciplinarity and impact: the effects of the citation time window," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2621-2642, May.
    4. Seolmin Yang & So Young Kim, 2023. "Knowledge-integrated research is more disruptive when supported by homogeneous funding sources: a case of US federally funded research in biomedical and life sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3257-3282, June.
    5. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    6. Lina Xu & Steven Dellaportas & Zhiqiang Yang & Jin Wang, 2023. "More on the relationship between interdisciplinary accounting research and citation impact," Accounting and Finance, Accounting and Finance Association of Australia and New Zealand, vol. 63(4), pages 4779-4803, December.
    7. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Nov 2023.
    8. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    9. Wolfgang Glänzel & Koenraad Debackere, 2022. "Various aspects of interdisciplinarity in research and how to quantify and measure those," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(9), pages 5551-5569, September.
    10. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    11. Fontana, Magda & Iori, Martina & Leone Sciabolazza, Valerio & Souza, Daniel, 2022. "The interdisciplinarity dilemma: Public versus private interests," Research Policy, Elsevier, vol. 51(7).
    12. Alfonso Ávila-Robinson & Cristian Mejia & Shintaro Sengoku, 2021. "Are bibliometric measures consistent with scientists’ perceptions? The case of interdisciplinarity in research," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(9), pages 7477-7502, September.
    13. Fei Shu & Jesse David Dinneen & Shiji Chen, 2022. "Measuring the disparity among scientific disciplines using Library of Congress Subject Headings," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(6), pages 3613-3628, June.
    14. Yuyan Jiang & Xueli Liu, 2023. "A construction and empirical research of the journal disruption index based on open citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 3935-3958, July.
    15. Hongyu Zhou & Raf Guns & Tim C. E. Engels, 2022. "Are social sciences becoming more interdisciplinary? Evidence from publications 1960–2014," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(9), pages 1201-1221, September.
    16. Xuefeng Wang & Zhinan Wang & Ying Huang & Yun Chen & Yi Zhang & Huichao Ren & Rongrong Li & Jinhui Pang, 2017. "Measuring interdisciplinarity of a research system: detecting distinction between publication categories and citation categories," Scientometrics, Springer;Akadémiai Kiadó, vol. 111(3), pages 2023-2039, June.
    17. Chen, Shiji & Qiu, Junping & Arsenault, Clément & Larivière, Vincent, 2021. "Exploring the interdisciplinarity patterns of highly cited papers," Journal of Informetrics, Elsevier, vol. 15(1).
    18. Higham, Kyle & Contisciani, Martina & De Bacco, Caterina, 2022. "Multilayer patent citation networks: A comprehensive analytical framework for studying explicit technological relationships," Technological Forecasting and Social Change, Elsevier, vol. 179(C).
    19. Liu, Meijun & Jaiswal, Ajay & Bu, Yi & Min, Chao & Yang, Sijie & Liu, Zhibo & Acuña, Daniel & Ding, Ying, 2022. "Team formation and team impact: The balance between team freshness and repeat collaboration," Journal of Informetrics, Elsevier, vol. 16(4).
    20. Jingjing Ren & Fang Wang & Minglu Li, 2023. "Dynamics and characteristics of interdisciplinary research in scientific breakthroughs: case studies of Nobel-winning research in the past 120 years," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4383-4419, August.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jpubli:v:8:y:2020:i:2:p:34-:d:373852. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.