
Measuring originality in science

Author

Listed:
  • Sotaro Shibayama

    (Lund University)

  • Jian Wang

    (Leiden University)

Abstract

Originality has self-evident importance for science, but objectively measuring originality poses a formidable challenge. We conceptualise originality as the degree to which a scientific discovery provides subsequent studies with unique knowledge that is not available from previous studies. Accordingly, we operationalise a new measure of originality for individual scientific papers building on the network betweenness centrality concept. Specifically, we measure the originality of a paper based on the directed citation network between its references and the subsequent papers citing it. We demonstrate the validity of this measure using survey information. In particular, we find that the proposed measure is positively correlated with the self-assessed theoretical originality but not with the methodological originality. We also find that originality can be reliably measured with only a small number of subsequent citing papers, which lowers computational cost and contributes to practical utility. The measure also predicts future citations, further confirming its validity. We further characterise the measure to guide its future use.
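The abstract describes the measure only verbally. As a rough illustration of the idea, the sketch below scores one focal paper from its local citation network: the fewer direct links between the papers that cite it and the papers it references, the more the focal paper acts as the unique bridge between the two sets. The function name, data structures, and the exact normalisation are assumptions made for illustration; they are not the paper's published formula.

    import itertools

    def originality_score(focal_refs, citing_papers, cites):
        """Hypothetical sketch of a reference-overlap originality score.

        focal_refs     -- set of papers referenced by the focal paper
        citing_papers  -- set of papers that cite the focal paper
        cites          -- dict mapping each citing paper to the set of papers it cites

        The score is the share of (citing paper, reference) pairs in which the
        citing paper does NOT also cite that reference directly, so it is high
        when subsequent work can reach the focal paper's source knowledge only
        through the focal paper itself.
        """
        if not focal_refs or not citing_papers:
            return None  # undefined without both references and citing papers
        pairs = len(focal_refs) * len(citing_papers)
        bypass = sum(
            1
            for c, r in itertools.product(citing_papers, focal_refs)
            if r in cites.get(c, set())
        )
        return 1.0 - bypass / pairs

Under this convention a score near 1 means the citing papers rely on the focal paper's references almost exclusively through the focal paper, echoing the betweenness-centrality intuition in the abstract; the finding that only a small number of citing papers is needed would correspond to sampling a subset of citing_papers here.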

Suggested Citation

  • Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
  • Handle: RePEc:spr:scient:v:122:y:2020:i:1:d:10.1007_s11192-019-03263-0
    DOI: 10.1007/s11192-019-03263-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-019-03263-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-019-03263-0?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    2. Naoki Shibata & Yuya Kajikawa & Katsumori Matsushima, 2007. "Topological analysis of citation networks to discover the future core articles," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(6), pages 872-882, April.
    3. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    4. Wang, Jian, 2016. "Knowledge creation in collaboration networks: Effects of tie configuration," Research Policy, Elsevier, vol. 45(1), pages 68-80.
    5. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    6. Hall, B. & Jaffe, A. & Trajtenberg, M., 2001. "The NBER Patent Citations Data File: Lessons, Insights and Methodological Tools," Papers 2001-29, Tel Aviv.
    7. Trapido, Denis, 2015. "How novelty in knowledge earns recognition: The role of consistent identities," Research Policy, Elsevier, vol. 44(8), pages 1488-1500.
    8. Andy Stirling, 2007. "A General Framework for Analysing Diversity in Science, Technology and Society," SPRU Working Paper Series 156, SPRU - Science Policy Research Unit, University of Sussex Business School.
    9. Stephan, Paula E., 2010. "The Economics of Science," Handbook of the Economics of Innovation, in: Bronwyn H. Hall & Nathan Rosenberg (ed.), Handbook of the Economics of Innovation, edition 1, volume 1, chapter 0, pages 217-273, Elsevier.
    10. Wang, Jian, 2014. "Unpacking the Matthew effect in citations," Journal of Informetrics, Elsevier, vol. 8(2), pages 329-339.
    11. Lee, You-Na & Walsh, John P. & Wang, Jian, 2015. "Creativity in scientific teams: Unpacking novelty and impact," Research Policy, Elsevier, vol. 44(3), pages 684-697.
    12. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    13. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    14. Gómez, Daniel & Figueira, José Rui & Eusébio, Augusto, 2013. "Modeling centrality measures in social network analysis using bi-criteria network flow optimization problems," European Journal of Operational Research, Elsevier, vol. 226(2), pages 354-365.
    15. Jian Wang & Bart Thijs & Wolfgang Glänzel, 2015. "Interdisciplinarity and Impact: Distinct Effects of Variety, Balance, and Disparity," PLOS ONE, Public Library of Science, vol. 10(5), pages 1-18, May.
    16. Anthony F. J. van Raan, 2004. "Sleeping Beauties in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 59(3), pages 467-472, March.
    17. Manuel Trajtenberg & Rebecca Henderson & Adam Jaffe, 1997. "University Versus Corporate Patents: A Window On The Basicness Of Invention," Economics of Innovation and New Technology, Taylor & Francis Journals, vol. 5(1), pages 19-50.
    18. Partha, Dasgupta & David, Paul A., 1994. "Toward a new economics of science," Research Policy, Elsevier, vol. 23(5), pages 487-521, September.
    19. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    20. Alfredo Yegros-Yegros & Ismael Rafols & Pablo D’Este, 2015. "Does Interdisciplinary Research Lead to Higher Citation Impact? The Different Effect of Proximal and Distal Interdisciplinarity," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-21, August.
    21. Jian Wang, 2013. "Citation time window choice for research impact evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 94(3), pages 851-872, March.
    22. Loet Leydesdorff, 2007. "Betweenness centrality as an indicator of the interdisciplinarity of scientific journals," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 58(9), pages 1303-1319, July.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Ruijie Wang & Yuhao Zhou & An Zeng, 2023. "Evaluating scientists by citation and disruption of their representative works," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1689-1710, March.
    2. Boris Forthmann & Mark A. Runco, 2020. "An Empirical Test of the Inter-Relationships between Various Bibliometric Creative Scholarship Indicators," Publications, MDPI, vol. 8(2), pages 1-16, June.
    3. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    4. Narayanamurti, Venkatesh & Tsao, Jeffrey Y., 2024. "How technoscientific knowledge advances: A Bell-Labs-inspired architecture," Research Policy, Elsevier, vol. 53(4).
    5. Boris Forthmann & Mark Leveling & Yixiao Dong & Denis Dumas, 2020. "Investigating the quantity–quality relationship in scientific creativity: an empirical examination of expected residual variance and the tilted funnel hypothesis," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(3), pages 2497-2518, September.
    6. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Shiji Chen & Yanhui Song & Fei Shu & Vincent Larivière, 2022. "Interdisciplinarity and impact: the effects of the citation time window," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(5), pages 2621-2642, May.
    2. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    3. Seolmin Yang & So Young Kim, 2023. "Knowledge-integrated research is more disruptive when supported by homogeneous funding sources: a case of US federally funded research in biomedical and life sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3257-3282, June.
    4. Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
    5. Wang, Jian, 2016. "Knowledge creation in collaboration networks: Effects of tie configuration," Research Policy, Elsevier, vol. 45(1), pages 68-80.
    6. Fontana, Magda & Iori, Martina & Montobbio, Fabio & Sinatra, Roberta, 2020. "New and atypical combinations: An assessment of novelty and interdisciplinarity," Research Policy, Elsevier, vol. 49(7).
    7. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    8. Boris Forthmann & Mark A. Runco, 2020. "An Empirical Test of the Inter-Relationships between Various Bibliometric Creative Scholarship Indicators," Publications, MDPI, vol. 8(2), pages 1-16, June.
    9. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    10. Jian Wang & Bart Thijs & Wolfgang Glänzel, 2015. "Interdisciplinarity and Impact: Distinct Effects of Variety, Balance, and Disparity," PLOS ONE, Public Library of Science, vol. 10(5), pages 1-18, May.
    11. Andrea Bonaccorsi & Nicola Melluso & Francesco Alessandro Massucci, 2022. "Exploring the antecedents of interdisciplinarity at the European Research Council: a topic modeling approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 6961-6991, December.
    12. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    13. Kwon, Seokbeom, 2022. "Interdisciplinary knowledge integration as a unique knowledge source for technology development and the role of funding allocation," Technological Forecasting and Social Change, Elsevier, vol. 181(C).
    14. Chen, Jiyao & Shao, Diana & Fan, Shaokun, 2021. "Destabilization and consolidation: Conceptualizing, measuring, and validating the dual characteristics of technology," Research Policy, Elsevier, vol. 50(1).
    15. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    16. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Oct 2024.
    17. Veugelers, Reinhilde & Wang, Jian, 2019. "Scientific novelty and technological impact," Research Policy, Elsevier, vol. 48(6), pages 1362-1372.
    18. Nicolas Carayol, 2016. "The Right Job and the Job Right: Novelty, Impact and Journal Stratification in Science," Post-Print hal-02274661, HAL.
    19. Fontana, Magda & Iori, Martina & Leone Sciabolazza, Valerio & Souza, Daniel, 2022. "The interdisciplinarity dilemma: Public versus private interests," Research Policy, Elsevier, vol. 51(7).
    20. Charles Ayoubi & Michele Pezzoni & Fabiana Visentin, 2021. "Does It Pay to Do Novel Science? The Selectivity Patterns in Science Funding," Science and Public Policy, Oxford University Press, vol. 48(5), pages 635-648.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:122:y:2020:i:1:d:10.1007_s11192-019-03263-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.