Printed from https://ideas.repec.org/p/arx/papers/2603.20319.html

Implementation Risk in Portfolio Backtesting: A Previously Unquantified Source of Error

Authors

  • Dong Yin
  • Takeshi Miki
  • Vladislav Lesnichenko
  • Vasyl Gural

Abstract

Portfolio backtesting is the primary tool for evaluating investment strategies before deployment, yet practitioners implicitly assume that different engines produce identical results for the same strategy. We formalise implementation risk, the systematic divergence in backtested portfolio metrics arising solely from differences in how engines implement the same logical strategy, and propose four metrics grounded in metrology to quantify it: engine sensitivity, implementation uncertainty interval, divergence amplification factor, and conclusion stability index. We execute 15 benchmark strategies through five independent open-source engines on 30 non-overlapping stratified asset buckets comprising 180 S&P 500 stocks under four transaction-cost regimes. At zero cost, all five engines agree exactly (maximum divergence 0.000%), isolating transaction-cost implementation as the sole source of disagreement. Under nonzero costs, divergence is structured and predictable (Spearman rho = 0.93 with cost intensity), remaining below 0.75 percentage points for most strategies but reaching 3.71% for high-turnover rotation strategies. Source-code forensics uncovered seven previously undocumented defects across three engines, abstracted into a five-category failure-mode taxonomy. All engines agree on the sign of every performance metric (conclusion stability index = 1), so implementation risk does not alter investment decisions for the strategies studied but introduces measurable ambiguity in performance attribution. Code and benchmark data are publicly available.
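The abstract's notions of cross-engine divergence and the conclusion stability index can be sketched in a few lines. The exact formulas below are assumptions made for illustration (only the abstract is reproduced here): divergence is taken as the spread of one metric across engines, and the conclusion stability index as strict sign agreement, which matches the abstract's statement that all engines agreeing on the sign of every metric yields an index of 1.

```python
# Hypothetical sketch of two cross-engine metrics; the definitions are
# illustrative assumptions, not the paper's published formulas.

def max_divergence(estimates):
    """Spread (max - min) of one performance metric across engines,
    expressed in the metric's own units (assumed definition)."""
    return max(estimates) - min(estimates)

def conclusion_stability_index(estimates):
    """1.0 when every engine agrees on the sign of the metric, so the
    qualitative investment conclusion is unchanged; 0.0 otherwise
    (assumed definition based on the abstract's sign-agreement wording)."""
    all_positive = all(e > 0 for e in estimates)
    all_negative = all(e < 0 for e in estimates)
    return 1.0 if (all_positive or all_negative) else 0.0

# Made-up Sharpe-ratio estimates of one strategy from five engines:
sharpe = [1.10, 1.12, 1.08, 1.11, 1.10]
print(max_divergence(sharpe))             # spread across the five engines
print(conclusion_stability_index(sharpe)) # all positive, so 1.0
```

Under these assumed definitions, a zero-cost regime where all engines agree exactly gives a spread of 0.0, and any sign disagreement on a metric would drop the stability index to 0.0, flagging a strategy whose accept/reject decision depends on the choice of engine.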

Suggested Citation

  • Dong Yin & Takeshi Miki & Vladislav Lesnichenko & Vasyl Gural, 2026. "Implementation Risk in Portfolio Backtesting: A Previously Unquantified Source of Error," Papers 2603.20319, arXiv.org.
  • Handle: RePEc:arx:papers:2603.20319

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2603.20319
    File Function: Latest version
    Download Restriction: no


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ferman, Bruno & Finamor, Lucas, 2025. "There must be an error here! Experimental evidence on coding errors' biases," I4R Discussion Paper Series 266, The Institute for Replication (I4R).
    2. Hasso, Tim & Brosnan, Mark & Ali, Searat & Chai, Daniel, 2025. "Perceived problems, causes, and solutions of finance research reproducibility and replicability: A pre-registered report," Pacific-Basin Finance Journal, Elsevier, vol. 91(C).
    3. Tom L. Dudda & Lars Hornuf, 2025. "The Perks and Perils of Machine Learning in Business and Economic Research," CESifo Working Paper Series 11721, CESifo.
    4. Julian Junyan Wang & Victor Xiaoqi Wang, 2025. "Assessing Consistency and Reproducibility in the Outputs of Large Language Models: Evidence Across Diverse Finance and Accounting Tasks," Papers 2503.16974, arXiv.org, revised Sep 2025.
    5. Balafoutas, Loukas & Celse, Jeremy & Karakostas, Alexandros & Umashev, Nicholas, 2025. "Incentives and the replication crisis in social sciences: A critical review of open science practices," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 114(C).


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2603.20319. See general information about how to correct material in RePEc.


    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.