
Bias against scientific novelty: A prepublication perspective

Author

Listed:
  • Zhentao Liang
  • Jin Mao
  • Gang Li

Abstract

Novel ideas often meet resistance from incumbent forces. While evidence of bias against novelty has been widely documented in science, large‐scale quantitative work on this problem in the prepublication process of manuscripts is still lacking. This paper examines the association between manuscript novelty and handling time before publication, based on 778,345 articles in 1,159 journals indexed by PubMed. Measuring novelty as the extent to which manuscripts disrupt existing knowledge, we found systematic evidence that higher novelty is associated with longer handling time. Matching and fixed‐effect models were adopted to confirm the statistical significance of this pattern. Moreover, submissions from prestigious authors and institutions have the advantage of shorter handling time, but this advantage diminishes as manuscript novelty increases. In addition, we found that longer handling time is negatively related to the impact of manuscripts, while the relationships between novelty and 3‐ and 5‐year citations are U‐shaped. This study expands existing knowledge of the novelty bias by examining its existence in the prepublication process of manuscripts.
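The abstract's notion of novelty as "the extent to which manuscripts disrupt existing knowledge" corresponds to the disruption (CD) index introduced by Funk and Owen-Smith (2017), which appears in the reference list below. A minimal sketch of that index, assuming a simple citation-network representation (the `cites` mapping and function name are illustrative, not from the paper):

```python
def cd_index(focal, cites):
    """Compute the CD (disruption) index of a focal paper.

    cites: dict mapping a paper id to the set of paper ids it cites.
    Returns a value in [-1, 1]: +1 means every later paper cites the
    focal paper while ignoring its references (disruptive); -1 means
    every later paper cites the focal paper together with its
    references (consolidating).
    """
    refs = cites.get(focal, set())
    n_i = n_j = n_k = 0  # focal only / focal+refs / refs only
    for paper, cited in cites.items():
        if paper == focal:
            continue
        cites_focal = focal in cited
        cites_refs = bool(cited & refs)
        if cites_focal and not cites_refs:
            n_i += 1
        elif cites_focal and cites_refs:
            n_j += 1
        elif cites_refs:
            n_k += 1
    total = n_i + n_j + n_k
    return (n_i - n_j) / total if total else 0.0
```

For example, if papers X and Y cite only the focal paper F while paper Z cites F together with one of F's references, the index is (2 - 1) / 3 ≈ 0.33, i.e., mildly disruptive.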

Suggested Citation

  • Zhentao Liang & Jin Mao & Gang Li, 2023. "Bias against scientific novelty: A prepublication perspective," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(1), pages 99-114, January.
  • Handle: RePEc:bla:jinfst:v:74:y:2023:i:1:p:99-114
    DOI: 10.1002/asi.24725

    Download full text from publisher

    File URL: https://doi.org/10.1002/asi.24725
    Download Restriction: no

    File URL: https://libkey.io/10.1002/asi.24725?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Mengyi Sun & Jainabou Barry Danfa & Misha Teplitskiy, 2021. "Does double-blind peer-review reduce bias? Evidence from a top computer science conference," Papers 2101.02701, arXiv.org.
    2. Lingfei Wu & Dashun Wang & James A. Evans, 2019. "Large teams develop and small teams disrupt science and technology," Nature, Nature, vol. 566(7744), pages 378-382, February.
    3. Alfredo Yegros Yegros & Carlos B. Amat, 2009. "Editorial delay of food research papers is influenced by authors’ experience but not by country of origin of the manuscripts," Scientometrics, Springer;Akadémiai Kiadó, vol. 81(2), pages 367-380, November.
    4. Zhenquan Lin & Shanci Hou & Jinshan Wu, 2016. "The correlation between editorial delay and the ratio of highly cited papers in Nature, Science and Physical Review Letters," Scientometrics, Springer;Akadémiai Kiadó, vol. 107(3), pages 1457-1464, June.
    5. Lutz Bornmann & Hans-Dieter Daniel, 2009. "Reviewer and editor biases in journal peer review: an investigation of manuscript refereeing at Angewandte Chemie International Edition," Research Evaluation, Oxford University Press, vol. 18(4), pages 262-272, October.
    6. Chai, Sen & Menon, Anoop, 2019. "Breakthrough recognition: Bias against novelty and competition for attention," Research Policy, Elsevier, vol. 48(3), pages 733-747.
    7. Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.
    8. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    9. Si Shen & Ronald Rousseau & Dongbo Wang & Danhao Zhu & Huoyu Liu & Ruilun Liu, 2015. "Editorial delay and its relation to subsequent citations: the journals Nature, Science and Cell," Scientometrics, Springer;Akadémiai Kiadó, vol. 105(3), pages 1867-1873, December.
    10. Veugelers, Reinhilde & Wang, Jian, 2019. "Scientific novelty and technological impact," Research Policy, Elsevier, vol. 48(6), pages 1362-1372.
    11. Bornmann, Lutz & Tekles, Alexander & Zhang, Helena H. & Ye, Fred Y., 2019. "Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000Prime data," Journal of Informetrics, Elsevier, vol. 13(4).
    12. Kevin J. Boudreau & Eva C. Guinan & Karim R. Lakhani & Christoph Riedl, 2016. "Looking Across and Looking Beyond the Knowledge Frontier: Intellectual Distance, Novelty, and Resource Allocation in Science," Management Science, INFORMS, vol. 62(10), pages 2765-2783, October.
    13. Dondio, Pierpaolo & Casnici, Niccolò & Grimaldo, Francisco & Gilbert, Nigel & Squazzoni, Flaminio, 2019. "The “invisible hand” of peer review: The implications of author-referee networks on peer review in a scholarly journal," Journal of Informetrics, Elsevier, vol. 13(2), pages 708-716.
    14. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    15. Lindell Bromham & Russell Dinnage & Xia Hua, 2016. "Interdisciplinary research has consistently lower funding success," Nature, Nature, vol. 534(7609), pages 684-687, June.
    16. Iacus, Stefano M. & King, Gary & Porro, Giuseppe, 2012. "Causal Inference without Balance Checking: Coarsened Exact Matching," Political Analysis, Cambridge University Press, vol. 20(1), pages 1-24, January.
    17. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    19. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    20. Zhentao Liang & Jin Mao & Kun Lu & Gang Li, 2021. "Finding citations for PubMed: a large-scale comparison between five freely available bibliographic data sources," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(12), pages 9519-9542, December.
    21. Lutz Bornmann & Hans-Dieter Daniel, 2006. "Potential sources of bias in research fellowship assessments: effects of university prestige and field of study," Research Evaluation, Oxford University Press, vol. 15(3), pages 209-219, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    2. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Nov 2023.
    3. Libo Sheng & Dongqing Lyu & Xuanmin Ruan & Hongquan Shen & Ying Cheng, 2023. "The association between prior knowledge and the disruption of an article," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4731-4751, August.
    4. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    5. Seolmin Yang & So Young Kim, 2023. "Knowledge-integrated research is more disruptive when supported by homogeneous funding sources: a case of US federally funded research in biomedical and life sciences," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(6), pages 3257-3282, June.
    6. Guoqiang Liang & Ying Lou & Haiyan Hou, 2022. "Revisiting the disruptive index: evidence from the Nobel Prize-winning articles," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(10), pages 5721-5730, October.
    7. Jürgen Janger & Nicole Schmidt & Anna Strauss, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664, June.
    8. Rodríguez Sánchez, Isabel & Makkonen, Teemu & Williams, Allan M., 2019. "Peer review assessment of originality in tourism journals: critical perspective of key gatekeepers," Annals of Tourism Research, Elsevier, vol. 77(C), pages 1-11.
    9. Ke, Qing, 2020. "Technological impact of biomedical research: The role of basicness and novelty," Research Policy, Elsevier, vol. 49(7).
    10. Yue Wang & Ning Li & Bin Zhang & Qian Huang & Jian Wu & Yang Wang, 2023. "The effect of structural holes on producing novel and disruptive research in physics," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(3), pages 1801-1823, March.
    11. Stephen A Gallo & Joanne H Sullivan & Scott R Glisson, 2016. "The Influence of Peer Reviewer Expertise on the Evaluation of Research Funding Applications," PLOS ONE, Public Library of Science, vol. 11(10), pages 1-18, October.
    12. Shiyun Wang & Yaxue Ma & Jin Mao & Yun Bai & Zhentao Liang & Gang Li, 2023. "Quantifying scientific breakthroughs by a novel disruption indicator based on knowledge entities," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 74(2), pages 150-167, February.
    13. Yuyan Jiang & Xueli Liu, 2023. "A construction and empirical research of the journal disruption index based on open citation data," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(7), pages 3935-3958, July.
    14. Sotaro Shibayama & Deyun Yin & Kuniko Matsumoto, 2021. "Measuring novelty in science with word embedding," PLOS ONE, Public Library of Science, vol. 16(7), pages 1-16, July.
    15. Lu Liu & Benjamin F. Jones & Brian Uzzi & Dashun Wang, 2023. "Data, measurement and empirical methods in the science of science," Nature Human Behaviour, Nature, vol. 7(7), pages 1046-1058, July.
    16. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    17. Kwon, Seokbeom, 2022. "Interdisciplinary knowledge integration as a unique knowledge source for technology development and the role of funding allocation," Technological Forecasting and Social Change, Elsevier, vol. 181(C).
    18. Peter Sjögårde & Fereshteh Didegah, 2022. "The association between topic growth and citation impact of research publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 127(4), pages 1903-1921, April.
    19. Wu, Lingfei & Kittur, Aniket & Youn, Hyejin & Milojević, Staša & Leahey, Erin & Fiore, Stephen M. & Ahn, Yong-Yeol, 2022. "Metrics and mechanisms: Measuring the unmeasurable in the science of science," Journal of Informetrics, Elsevier, vol. 16(2).
    20. Leydesdorff, Loet & Bornmann, Lutz, 2021. "Disruption indices and their calculation using web-of-science data: Indicators of historical developments or evolutionary dynamics?," Journal of Informetrics, Elsevier, vol. 15(4).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jinfst:v:74:y:2023:i:1:p:99-114. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: http://www.asis.org.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.