
Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data

Author

Listed:
  • Bornmann, Lutz
  • Tekles, Alexander
  • Zhang, Helena H.
  • Ye, Fred Y.

Abstract

Lee et al. (2015), building on Uzzi et al. (2013), and Wang et al. (2017) proposed scores based on cited-references (cited-journals) data that can be used to measure the novelty of papers (referred to as novelty scores U and W in this study). Although previous research has used novelty scores in various empirical analyses, to the best of our knowledge no study published to date has quantitatively tested the convergent validity of novelty scores: do these scores measure what they purport to measure? Using novelty assessments by faculty members (FMs) at F1000Prime for comparison, we tested the convergent validity of the two novelty scores (U and W). FMs’ assessments refer not only to the quality of biomedical papers but also to their characteristics, which FMs indicate by assigning certain tags to the papers: for example, are the presented findings or formulated hypotheses novel (tags “new finding” and “hypothesis”)? We used these and other tags to investigate the convergent validity of both novelty scores. Our study reveals different results for the two novelty scores. The results for novelty score U are mostly in agreement with previously formulated expectations: we found, for instance, that for a one-standard-deviation (one-unit) increase in novelty score U, the expected number of assignments of the “new finding” tag increases by 7.47%. The results for novelty score W, however, do not reflect convergent validity with the FMs’ assessments; only the results for some tags are in agreement with the expectations. Based on our results, we therefore propose using novelty score U to measure novelty quantitatively, but we question the use of novelty score W.
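The reported 7.47% figure follows the standard interpretation of a coefficient in a log-link count model. Below is a minimal sketch, assuming (as the wording suggests) that tag counts were regressed on standardized novelty scores with a log link; the model family and the back-solved coefficient are illustrative assumptions, not values taken from the paper.

    import math

    # Minimal sketch: percentage change in the expected number of tag
    # assignments per one-standard-deviation increase in a standardized
    # novelty score, under a log-link count model (e.g., Poisson or
    # negative binomial). beta_u is back-solved from the reported +7.47%
    # and is illustrative only.
    beta_u = math.log(1.0747)

    def pct_change(beta: float, delta_sd: float = 1.0) -> float:
        """Percentage change in the expected count for a delta_sd-SD increase."""
        return (math.exp(beta * delta_sd) - 1.0) * 100.0

    print(f"{pct_change(beta_u):.2f}%")       # 7.47%, the effect quoted above
    print(f"{pct_change(beta_u, 2.0):.2f}%")  # 15.50% for a two-SD increase

Note that effects in such models are multiplicative: a two-standard-deviation increase compounds to about 15.5%, not the additive 14.9%.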

Suggested Citation

  • Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
  • Handle: RePEc:eee:infome:v:13:y:2019:i:4:s1751157718304371
    DOI: 10.1016/j.joi.2019.100979

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S1751157718304371
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.joi.2019.100979?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Mikko Packalen & Jay Bhattacharya, 2017. "Neophilia ranking of scientific journals," Scientometrics, Springer;Akadémiai Kiadó, vol. 110(1), pages 43-64, January.
    2. Uddin, Shahadat & Khan, Arif, 2016. "The impact of author-selected keywords on citation counts," Journal of Informetrics, Elsevier, vol. 10(4), pages 1166-1177.
    3. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    4. Sarah Kaplan & Keyvan Vakili, 2015. "The double-edged sword of recombination in breakthrough innovation," Strategic Management Journal, Wiley Blackwell, vol. 36(10), pages 1435-1457, October.
    5. Lutz Bornmann & Julian N. Marewski, 2019. "Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation," Scientometrics, Springer;Akadémiai Kiadó, vol. 120(2), pages 419-459, August.
    6. Martin, Ben R. & Irvine, John, 1993. "Assessing basic research: Some partial indicators of scientific progress in radio astronomy," Research Policy, Elsevier, vol. 22(2), pages 106-106, April.
    7. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.
    8. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    9. Lee Fleming, 2001. "Recombinant Uncertainty in Technological Search," Management Science, INFORMS, vol. 47(1), pages 117-132, January.
    10. Jesper W. Schneider & Rodrigo Costas, 2017. "Identifying potential “breakthrough” publications using refined citation analyses: Three related explorative approaches," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(3), pages 709-723, March.
    11. Verhoeven, Dennis & Bakker, Jurriën & Veugelers, Reinhilde, 2016. "Measuring technological novelty with patent-based indicators," Research Policy, Elsevier, vol. 45(3), pages 707-723.
    12. Bornmann, Lutz & Leydesdorff, Loet, 2013. "The validation of (advanced) bibliometric indicators through peer assessments: A comparative study using data from InCites and F1000," Journal of Informetrics, Elsevier, vol. 7(2), pages 286-291.
    13. Jinseok Kim & Jana Diesner, 2015. "Coauthorship networks: A directed network approach considering the order and number of coauthors," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2685-2696, December.
    14. Lee, You-Na & Walsh, John P. & Wang, Jian, 2015. "Creativity in scientific teams: Unpacking novelty and impact," Research Policy, Elsevier, vol. 44(3), pages 684-697.
    15. Wang, Jian & Lee, You-Na & Walsh, John P., 2018. "Funding model and creativity in science: Competitive versus block funding and status contingency effects," Research Policy, Elsevier, vol. 47(6), pages 1070-1083.
    16. Ludo Waltman & Rodrigo Costas, 2014. "F1000 Recommendations as a Potential New Data Source for Research Evaluation: A Comparison With Citations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 65(3), pages 433-445, March.
    17. Jacques Mairesse & Michele Pezzoni, 2018. "Novelty in Science: The Impact of French Physicists' Novel Articles," GREDEG Working Papers 2018-23, Groupe de REcherche en Droit, Economie, Gestion (GREDEG CNRS), Université Côte d'Azur, France.
    18. Salil Gunashekar & Steven Wooding & Susan Guthrie, 2017. "How do NIHR peer review panels use bibliometric information to support their decisions?," Scientometrics, Springer;Akadémiai Kiadó, vol. 112(3), pages 1813-1835, September.
    19. Ponomarev, Ilya V. & Williams, Duane E. & Hackett, Charles J. & Schnell, Joshua D. & Haak, Laurel L., 2014. "Predicting highly cited papers: A Method for Early Detection of Candidate Breakthroughs," Technological Forecasting and Social Change, Elsevier, vol. 81(C), pages 49-55.
    20. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    21. Jian Du & Xiaoli Tang & Yishan Wu, 2016. "The effects of research level and article type on the differences between citation metrics and F1000 recommendations," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 67(12), pages 3008-3021, December.
    22. Ehsan Mohammadi & Mike Thelwall, 2013. "Assessing non-standard article impact using F1000 labels," Scientometrics, Springer;Akadémiai Kiadó, vol. 97(2), pages 383-395, November.
    23. Lutz Bornmann & Alexander Tekles & Loet Leydesdorff, 2019. "How well does I3 perform for impact measurement compared to other bibliometric indicators? The convergent validity of several (field-normalized) indicators," Scientometrics, Springer;Akadémiai Kiadó, vol. 119(2), pages 1187-1205, May.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Kuniko Matsumoto & Sotaro Shibayama & Byeongwoo Kang & Masatsura Igami, 2021. "Introducing a novelty indicator for scientific research: validating the knowledge-based combinatorial approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6891-6915, August.
    2. Erin Leahey & Jina Lee & Russell J. Funk, 2023. "What Types of Novelty Are Most Disruptive?," American Sociological Review, vol. 88(3), pages 562-597, June.
    3. Sotaro Shibayama & Deyun Yin & Kuniko Matsumoto, 2021. "Measuring novelty in science with word embedding," PLOS ONE, Public Library of Science, vol. 16(7), pages 1-16, July.
    4. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    5. Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
    6. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    7. Yulin Yu & Daniel M. Romero, 2024. "Does the Use of Unusual Combinations of Datasets Contribute to Greater Scientific Impact?," Papers 2402.05024, arXiv.org, revised Feb 2024.
    8. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    9. Helena H. Zhang & Fred Y. Ye, 2020. "Identifying ‘associated-sleeping-beauties’ in ‘swan-groups’ based on small qualified datasets of physics and economics," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(3), pages 1525-1537, March.
    10. Wenjie Wei & Hongxu Liu & Zhuanlan Sun, 2022. "Cover papers of top journals are reliable source for emerging topics detection: a machine learning based prediction framework," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(8), pages 4315-4333, August.
    11. Andrea Bonaccorsi & Nicola Melluso & Francesco Alessandro Massucci, 2022. "Exploring the antecedents of interdisciplinarity at the European Research Council: a topic modeling approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 6961-6991, December.
    12. Elizabeth S. Vieira, 2023. "The influence of research collaboration on citation impact: the countries in the European Innovation Scoreboard," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3555-3579, June.
    13. Sandra Rousseau & Ronald Rousseau, 2021. "Bibliometric Techniques And Their Use In Business And Economics Research," Journal of Economic Surveys, Wiley Blackwell, vol. 35(5), pages 1428-1451, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one (a minimal sketch of this overlap idea follows the list below).
    1. Kuniko Matsumoto & Sotaro Shibayama & Byeongwoo Kang & Masatsura Igami, 2021. "Introducing a novelty indicator for scientific research: validating the knowledge-based combinatorial approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6891-6915, August.
    2. Luo, Zhuoran & Lu, Wei & He, Jiangen & Wang, Yuqi, 2022. "Combination of research questions and methods: A new measurement of scientific novelty," Journal of Informetrics, Elsevier, vol. 16(2).
    3. Yan Yan & Shanwu Tian & Jingjing Zhang, 2020. "The impact of a paper’s new combinations and new components on its citation," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(2), pages 895-913, February.
    4. Fontana, Magda & Iori, Martina & Montobbio, Fabio & Sinatra, Roberta, 2020. "New and atypical combinations: An assessment of novelty and interdisciplinarity," Research Policy, Elsevier, vol. 49(7).
    5. Veugelers, Reinhilde & Wang, Jian, 2019. "Scientific novelty and technological impact," Research Policy, Elsevier, vol. 48(6), pages 1362-1372.
    6. Dongqing Lyu & Kaile Gong & Xuanmin Ruan & Ying Cheng & Jiang Li, 2021. "Does research collaboration influence the “disruption” of articles? Evidence from neurosciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(1), pages 287-303, January.
    7. Banal-Estañol, Albert & Macho-Stadler, Inés & Pérez-Castrillo, David, 2019. "Evaluation in research funding agencies: Are structurally diverse teams biased against?," Research Policy, Elsevier, vol. 48(7), pages 1823-1840.
    8. Ke, Qing, 2020. "Technological impact of biomedical research: The role of basicness and novelty," Research Policy, Elsevier, vol. 49(7).
    9. Lutz Bornmann, 2015. "Interrater reliability and convergent validity of F1000Prime peer review," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 66(12), pages 2415-2426, December.
    10. Albert Banal-Estañol & Ines Macho-Stadler & David Pérez-Castrillo, 2016. "Key Success Drivers in Public Research Grants: Funding the Seeds of Radical Innovation in Academia?," CESifo Working Paper Series 5852, CESifo.
    11. Nicolas Carayol, 2016. "The Right Job and the Job Right: Novelty, Impact and Journal Stratification in Science," Post-Print hal-02274661, HAL.
    12. Ron Boschma & Ernest Miguelez & Rosina Moreno & Diego B. Ocampo-Corrales, 2021. "Technological breakthroughs in European regions: the role of related and unrelated combinations," Papers in Evolutionary Economic Geography (PEEG) 2118, Utrecht University, Department of Human Geography and Spatial Planning, Group Economic Geography, revised Jun 2021.
    13. Deichmann, Dirk & Moser, Christine & Birkholz, Julie M. & Nerghes, Adina & Groenewegen, Peter & Wang, Shenghui, 2020. "Ideas with impact: How connectivity shapes idea diffusion," Research Policy, Elsevier, vol. 49(1).
    14. Pierre Pelletier & Kevin Wirtz, 2023. "Sails and Anchors: The Complementarity of Exploratory and Exploitative Scientists in Knowledge Creation," Papers 2312.10476, arXiv.org.
    15. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    16. Zhu, Kejia & Malhotra, Shavin & Li, Yaohan, 2022. "Technological diversity of patent applications and decision pendency," Research Policy, Elsevier, vol. 51(1).
    17. Pezzoni, Michele & Veugelers, Reinhilde & Visentin, Fabiana, 2022. "How fast is this novel technology going to be a hit? Antecedents predicting follow-on inventions," Research Policy, Elsevier, vol. 51(3).
    18. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    19. Mojisola Erdt & Aarthy Nagarajan & Sei-Ching Joanna Sin & Yin-Leng Theng, 2016. "Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(2), pages 1117-1166, November.
    20. Andrea Bonaccorsi & Nicola Melluso & Francesco Alessandro Massucci, 2022. "Exploring the antecedents of interdisciplinarity at the European Research Council: a topic modeling approach," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(12), pages 6961-6991, December.
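    The matching described above combines bibliographic coupling (shared cited works) with co-citation (shared citing works). As a hypothetical sketch, the identifiers and the simple additive score below are illustrative assumptions, not RePEc's actual ranking algorithm.

        # Hypothetical sketch of the relatedness idea described above: rank
        # candidate items by shared cited works (bibliographic coupling)
        # plus shared citing works (co-citation). Illustrative only.
        def relatedness(focal_refs: set[str], focal_citers: set[str],
                        cand_refs: set[str], cand_citers: set[str]) -> int:
            """Shared references plus shared citers between focal and candidate items."""
            return len(focal_refs & cand_refs) + len(focal_citers & cand_citers)

        focal = ({"uzzi2013", "lee2015", "wang2017"}, {"matsumoto2021", "leahey2023"})
        candidate = ({"uzzi2013", "wang2017", "fleming2001"}, {"matsumoto2021"})

        print(relatedness(*focal, *candidate))  # 3: two shared references, one shared citer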

    More about this item

    Keywords

    Bibliometrics; Novelty; Creativity; Cited references; F1000Prime;



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:infome:v:13:y:2019:i:4:s1751157718304371. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/joi.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.