Novelty and Impact of Economics Papers

Author

Listed:
  • Chaofeng Wu

Abstract

We propose a framework that recasts scientific novelty not as a single attribute of a paper, but as a reflection of its position within the evolving intellectual landscape. We decompose this position into two orthogonal dimensions: spatial novelty, which measures a paper's intellectual distinctiveness from its neighbors, and temporal novelty, which captures its engagement with a dynamic research frontier. To operationalize these concepts, we leverage Large Language Models to develop semantic isolation metrics that quantify a paper's location relative to the full-text literature. Applying this framework to a large corpus of economics articles, we uncover a fundamental trade-off: these two dimensions predict systematically different outcomes. Temporal novelty primarily predicts citation counts, whereas spatial novelty predicts disruptive impact. This distinction allows us to construct a typology of semantic neighborhoods, identifying four archetypes associated with distinct and predictable impact profiles. Our findings demonstrate that novelty can be understood as a multidimensional construct whose different forms, reflecting a paper's strategic location, have measurable and fundamentally distinct consequences for scientific progress.
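
A minimal sketch of how such embedding-based isolation metrics might be computed is shown below, assuming generic LLM embeddings of paper texts. The cosine-distance formulation, the k-nearest-neighbor average used here for spatial novelty, and the recent-centroid distance used here for temporal novelty are illustrative assumptions for this example, not the paper's exact definitions, and all function names are hypothetical.

    import numpy as np

    def cosine_distances(query: np.ndarray, corpus: np.ndarray) -> np.ndarray:
        """Cosine distance between one embedding vector and each row of a corpus matrix."""
        q = query / np.linalg.norm(query)
        c = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
        return 1.0 - c @ q

    def spatial_novelty(paper_vec: np.ndarray, corpus_vecs: np.ndarray, k: int = 10) -> float:
        """Illustrative spatial novelty: mean distance to the k nearest semantic
        neighbors, i.e. how isolated the paper is from its closest intellectual peers."""
        d = np.sort(cosine_distances(paper_vec, corpus_vecs))
        return float(d[:k].mean())

    def temporal_novelty(paper_vec: np.ndarray, corpus_vecs: np.ndarray,
                         years: np.ndarray, pub_year: int, window: int = 3) -> float:
        """Illustrative temporal novelty: distance to the centroid of papers published
        in the preceding `window` years, a rough proxy for the paper's position
        relative to the moving research frontier."""
        recent = corpus_vecs[(years >= pub_year - window) & (years < pub_year)]
        if recent.shape[0] == 0:
            return float("nan")
        centroid = recent.mean(axis=0)
        return float(cosine_distances(paper_vec, centroid[np.newaxis, :])[0])

    # Toy usage with random vectors standing in for embeddings of full paper texts.
    rng = np.random.default_rng(0)
    corpus = rng.normal(size=(500, 384))        # 500 prior papers, 384-dim embeddings
    years = rng.integers(2015, 2024, size=500)  # publication years of those papers
    paper = rng.normal(size=384)                # the focal paper's embedding
    print(spatial_novelty(paper, corpus, k=10))
    print(temporal_novelty(paper, corpus, years, pub_year=2024))

In practice the framework relies on full-text embeddings and the paper's own metric definitions; this sketch only fixes ideas about what semantic isolation from neighbors (spatial) and from the recent frontier (temporal) can look like computationally.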

Suggested Citation

  • Chaofeng Wu, 2025. "Novelty and Impact of Economics Papers," Papers 2511.01211, arXiv.org, revised Nov 2025.
  • Handle: RePEc:arx:papers:2511.01211

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2511.01211
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Drew Fudenberg & Jon Kleinberg & Annie Liang & Sendhil Mullainathan, 2022. "Measuring the Completeness of Economic Models," Journal of Political Economy, University of Chicago Press, vol. 130(4), pages 956-990.
    2. Christian Leibel & Lutz Bornmann, 2024. "What do we know about the disruption index in scientometrics? An overview of the literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(1), pages 601-639, January.
    3. repec:osf:socarx:2t46f_v1 is not listed on IDEAS
    4. Joshua Angrist & Pierre Azoulay & Glenn Ellison & Ryan Hill & Susan Feng Lu, 2017. "Economic Research Evolves: Fields and Styles," American Economic Review, American Economic Association, vol. 107(5), pages 293-297, May.
    5. Pierre Azoulay & Joshua S. Graff Zivin & Gustavo Manso, 2011. "Incentives and creativity: evidence from the academic life sciences," RAND Journal of Economics, RAND Corporation, vol. 42(3), pages 527-554, September.
    6. Vahe Tshitoyan & John Dagdelen & Leigh Weston & Alexander Dunn & Ziqin Rong & Olga Kononova & Kristin A. Persson & Gerbrand Ceder & Anubhav Jain, 2019. "Unsupervised word embeddings capture latent knowledge from materials science literature," Nature, Nature, vol. 571(7763), pages 95-98, July.
    7. Lee, You-Na & Walsh, John P. & Wang, Jian, 2015. "Creativity in scientific teams: Unpacking novelty and impact," Research Policy, Elsevier, vol. 44(3), pages 684-697.
    8. Russell J. Funk & Jason Owen-Smith, 2017. "A Dynamic Network Measure of Technological Change," Management Science, INFORMS, vol. 63(3), pages 791-817, March.
    9. Sotaro Shibayama & Deyun Yin & Kuniko Matsumoto, 2021. "Measuring novelty in science with word embedding," PLOS ONE, Public Library of Science, vol. 16(7), pages 1-16, July.
    10. Feng Shi & James Evans, 2023. "Surprising combinations of research contents and contexts are related to impact and emerge with scientific outsiders from distant disciplines," Nature Communications, Nature, vol. 14(1), pages 1-13, December.
    11. Foster, Jacob G. & Shi, Feng & Evans, James, 2021. "Surprise! Measuring Novelty as Expectation Violation," SocArXiv 2t46f, Center for Open Science.
    12. Luo, Zhuoran & Lu, Wei & He, Jiangen & Wang, Yuqi, 2022. "Combination of research questions and methods: A new measurement of scientific novelty," Journal of Informetrics, Elsevier, vol. 16(2).
    13. Trapido, Denis, 2015. "How novelty in knowledge earns recognition: The role of consistent identities," Research Policy, Elsevier, vol. 44(8), pages 1488-1500.
    14. Tahamtan, Iman & Bornmann, Lutz, 2018. "Creativity in science and the link to cited references: Is the creative potential of papers reflected in their cited references?," Journal of Informetrics, Elsevier, vol. 12(3), pages 906-930.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yi Zhao & Chengzhi Zhang, 2025. "A review on the novelty measurements of academic papers," Scientometrics, Springer;Akadémiai Kiadó, vol. 130(2), pages 727-753, February.
    2. Baicun Li & Aruhan Bai, 2025. "The influence of grant renewal on research content: evidence from NIH-funded PIs," Scientometrics, Springer;Akadémiai Kiadó, vol. 130(5), pages 2617-2638, May.
    3. Linming Xu & Baicun Li & Shuo Chen & Meijuan Li, 2025. "Research productivity and novelty under different funding models: evidence from NIH-funded research projects," Scientometrics, Springer;Akadémiai Kiadó, vol. 130(7), pages 3743-3771, July.
    4. Sam Arts & Nicola Melluso & Reinhilde Veugelers, 2023. "Beyond Citations: Measuring Novel Scientific Ideas and their Impact in Publication Text," Papers 2309.16437, arXiv.org, revised Dec 2024.
    5. Jeon, Daeseong & Lee, Junyoup & Ahn, Joon Mo & Lee, Changyong, 2023. "Measuring the novelty of scientific publications: A fastText and local outlier factor approach," Journal of Informetrics, Elsevier, vol. 17(4).
    6. Hou, Jianhua & Wang, Dongyi & Li, Jing, 2022. "A new method for measuring the originality of academic articles based on knowledge units in semantic networks," Journal of Informetrics, Elsevier, vol. 16(3).
    7. Wang, Zhongyi & Zhang, Haoxuan & Chen, Jiangping & Chen, Haihua, 2024. "An effective framework for measuring the novelty of scientific articles through integrated topic modeling and cloud model," Journal of Informetrics, Elsevier, vol. 18(4).
    8. Yang, Alex J., 2024. "Unveiling the impact and dual innovation of funded research," Journal of Informetrics, Elsevier, vol. 18(1).
    9. Sotaro Shibayama & Jian Wang, 2020. "Measuring originality in science," Scientometrics, Springer;Akadémiai Kiadó, vol. 122(1), pages 409-427, January.
    10. Lin, Yiling & Evans, James A. & Wu, Lingfei, 2022. "New directions in science emerge from disconnection and discord," Journal of Informetrics, Elsevier, vol. 16(1).
    11. Deyun Yin & Zhao Wu & Kazuki Yokota & Kuniko Matsumoto & Sotaro Shibayama, 2023. "Identify novel elements of knowledge with word embedding," PLOS ONE, Public Library of Science, vol. 18(6), pages 1-16, June.
    12. Yang, Alex J., 2025. "Text vs. citations: A comparative analysis of breakthrough and disruption metrics in patent innovation," Research Policy, Elsevier, vol. 54(8).
    13. Yuefen Wang & Lipeng Fan & Lei Wu, 2024. "A validation test of the Uzzi et al. novelty measure of innovation and applications to collaboration patterns between institutions," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(7), pages 4379-4394, July.
    14. Zhaoping Yan & Kaiyu Fan, 2024. "An integrated indicator for evaluating scientific papers: considering academic impact and novelty," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(11), pages 6909-6929, November.
    15. Sotaro Shibayama & Deyun Yin & Kuniko Matsumoto, 2021. "Measuring novelty in science with word embedding," PLOS ONE, Public Library of Science, vol. 16(7), pages 1-16, July.
    16. Bornmann, Lutz & Tekles, Alexander, 2021. "Convergent validity of several indicators measuring disruptiveness with milestone assignments to physics papers by experts," Journal of Informetrics, Elsevier, vol. 15(3).
    17. Christian Leibel & Lutz Bornmann, 2024. "What do we know about the disruption index in scientometrics? An overview of the literature," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(1), pages 601-639, January.
    18. Shiji Chen & Yanan Guo & Alvin Shijie Ding & Yanhui Song, 2024. "Is interdisciplinarity more likely to produce novel or disruptive research?," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(5), pages 2615-2632, May.
    19. Yang, Alex J. & Zhang, Yiqin & Wang, Zuorong & Wang, Hao & Deng, Sanhong, 2025. "Quantifying delayed recognition of scientists," Journal of Informetrics, Elsevier, vol. 19(3).
    20. Nicolas Carayol, 2016. "The Right Job and the Job Right: Novelty, Impact and Journal Stratification in Science," Post-Print hal-02274661, HAL.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2511.01211. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.