Printed from https://ideas.repec.org/p/arx/papers/2312.16099.html

Direct Multi-Step Forecast based Comparison of Nested Models via an Encompassing Test

Author

  • Jean-Yves Pitarakis

Abstract

We introduce a novel approach for comparing out-of-sample multi-step forecasts obtained from a pair of nested models, based on the forecast encompassing principle. Our approach relies on an alternative way of testing the population moment restriction implied by forecast encompassing, a restriction that links the forecast errors of the two competing models in a particular way. Its key advantage is that it bypasses the variance degeneracy problem afflicting model-based forecast comparisons across nested models. The result is a test statistic with a standard normal limiting distribution that is particularly simple to construct and accommodates both single-period and longer-horizon prediction comparisons. Inference is also shown to be robust to different predictor types, including stationary, highly persistent, and purely deterministic processes. Finally, we illustrate the use of our approach through an empirical application that explores the role of global inflation in enhancing individual country-specific inflation forecasts.
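The forecast encompassing principle behind the test can be illustrated with a minimal sketch. This is not the paper's actual statistic (which is designed to avoid the nested-model variance degeneracy the abstract mentions), but the textbook encompassing regression in the style of Harvey, Leybourne and Newbold: model 1 encompasses model 2 if the slope lambda is zero in a regression of model 1's forecast errors on the error differential. All data below are simulated and all names are hypothetical.

```python
import numpy as np

def encompassing_t_stat(e1, e2):
    """t-statistic for lambda = 0 in the encompassing regression
    e1_t = lambda * (e1_t - e2_t) + u_t (no intercept).
    A large t rejects the null that model 1's forecasts encompass
    model 2's, i.e. model 2 adds predictive content."""
    d = e1 - e2                                       # forecast-error differential
    lam = (d @ e1) / (d @ d)                          # OLS slope
    u = e1 - lam * d                                  # regression residuals
    se = np.sqrt((u @ u) / (len(e1) - 1) / (d @ d))   # conventional OLS standard error
    return lam, lam / se

# Simulated example: model 2 captures a signal that model 1 misses,
# so model 1 should NOT encompass model 2 (lambda near 1, large t).
rng = np.random.default_rng(0)
signal = rng.standard_normal(500)        # predictable component missed by model 1
noise = 0.5 * rng.standard_normal(500)   # common irreducible forecast error
e1 = signal + noise                      # smaller (nested) model's forecast errors
e2 = noise                               # larger (nesting) model's forecast errors
lam, t = encompassing_t_stat(e1, e2)
```

Under the null with serially uncorrelated one-step errors this t-statistic is asymptotically standard normal under textbook conditions; the paper's contribution is an alternative way of testing the underlying moment restriction that remains valid across nested models and multi-step horizons, where the naive version above runs into degeneracy problems.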

Suggested Citation

  • Jean-Yves Pitarakis, 2023. "Direct Multi-Step Forecast based Comparison of Nested Models via an Encompassing Test," Papers 2312.16099, arXiv.org.
  • Handle: RePEc:arx:papers:2312.16099

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2312.16099
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. West, Kenneth D, 1996. "Asymptotic Inference about Predictive Ability," Econometrica, Econometric Society, vol. 64(5), pages 1067-1084, September.
    2. Pitarakis, Jean-Yves, 2025. "A Novel Approach To Predictive Accuracy Testing In Nested Environments," Econometric Theory, Cambridge University Press, vol. 41(1), pages 35-78, February.
    3. Jörg Breitung & Malte Knüppel, 2021. "How far can we forecast? Statistical tests of the predictive content," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 36(4), pages 369-392, June.
    4. McCracken, Michael W., 2007. "Asymptotics for out of sample tests of Granger causality," Journal of Econometrics, Elsevier, vol. 140(2), pages 719-752, October.
    5. Giraitis, Liudas & Kokoszka, Piotr & Leipus, Remigijus, 2000. "Stationary Arch Models: Dependence Structure And Central Limit Theorem," Econometric Theory, Cambridge University Press, vol. 16(1), pages 3-22, February.
    6. Clark, Todd E. & West, Kenneth D., 2007. "Approximately normal tests for equal predictive accuracy in nested models," Journal of Econometrics, Elsevier, vol. 138(1), pages 291-311, May.
    7. Hendry, David F. & Richard, Jean-Francois, 1982. "On the formulation of empirical models in dynamic econometrics," Journal of Econometrics, Elsevier, vol. 20(1), pages 3-33, October.
    8. José Luis Montiel Olea & Mikkel Plagborg‐Møller, 2021. "Local Projection Inference Is Simpler and More Robust Than You Think," Econometrica, Econometric Society, vol. 89(4), pages 1789-1823, July.
    9. Geoffrey Decrouez & Peter Hall, 2014. "Split sample methods for constructing confidence intervals for binomial and Poisson parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 76(5), pages 949-975, November.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Alessandro Morico & Ovidijus Stauskas, 2025. "Robust Tests for Factor-Augmented Regressions with an Application to the novel EA-MD-QD Dataset," Papers 2504.08455, arXiv.org, revised Nov 2025.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Gonzalo, Jesús & Pitarakis, Jean-Yves, 2024. "Out-of-sample predictability in predictive regressions with many predictor candidates," International Journal of Forecasting, Elsevier, vol. 40(3), pages 1166-1178.
    2. Pitarakis, Jean-Yves, 2025. "A Novel Approach To Predictive Accuracy Testing In Nested Environments," Econometric Theory, Cambridge University Press, vol. 41(1), pages 35-78, February.
    3. Corradi, Valentina & Fosten, Jack & Gutknecht, Daniel, 2024. "Predictive ability tests with possibly overlapping models," Journal of Econometrics, Elsevier, vol. 241(1).
    4. Rossi, Barbara, 2013. "Advances in Forecasting under Instability," Handbook of Economic Forecasting, in: G. Elliott & C. Granger & A. Timmermann (ed.), Handbook of Economic Forecasting, edition 1, volume 2, chapter 0, pages 1203-1324, Elsevier.
    5. Kenneth S. Rogoff & Vania Stavrakeva, 2008. "The Continuing Puzzle of Short Horizon Exchange Rate Forecasting," NBER Working Papers 14071, National Bureau of Economic Research, Inc.
    6. Christopher J. Neely & David E. Rapach & Jun Tu & Guofu Zhou, 2014. "Forecasting the Equity Risk Premium: The Role of Technical Indicators," Management Science, INFORMS, vol. 60(7), pages 1772-1791, July.
    7. Richard A. Ashley & Kwok Ping Tsang, 2014. "Credible Granger-Causality Inference with Modest Sample Lengths: A Cross-Sample Validation Approach," Econometrics, MDPI, vol. 2(1), pages 1-20, March.
    8. Pablo Pincheira B., 2007. "Hidden Predictability in Economics: The Case of the Chilean Exchange Rate," Working Papers Central Bank of Chile 435, Central Bank of Chile.
    9. Kirstin Hubrich & Kenneth D. West, 2010. "Forecast evaluation of small nested model sets," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 25(4), pages 574-594.
    10. Ma, Feng & Liu, Jing & Wahab, M.I.M. & Zhang, Yaojie, 2018. "Forecasting the aggregate oil price volatility in a data-rich environment," Economic Modelling, Elsevier, vol. 72(C), pages 320-332.
    11. Pablo Pincheira & Jorge Selaive, 2011. "External imbalance, valuation adjustments and real Exchange rate: evidence of predictability in an emerging economy," Revista de Analisis Economico – Economic Analysis Review, Universidad Alberto Hurtado/School of Economics and Business, vol. 26(1), pages 107-125, Junio.
    12. Clark, Todd & McCracken, Michael, 2013. "Advances in Forecast Evaluation," Handbook of Economic Forecasting, in: G. Elliott & C. Granger & A. Timmermann (ed.), Handbook of Economic Forecasting, edition 1, volume 2, chapter 0, pages 1107-1201, Elsevier.
    13. Li, Yan & Ng, David T. & Swaminathan, Bhaskaran, 2013. "Predicting market returns using aggregate implied cost of capital," Journal of Financial Economics, Elsevier, vol. 110(2), pages 419-436.
    14. Brooks, Chris & Burke, Simon P. & Stanescu, Silvia, 2016. "Finite sample weighting of recursive forecast errors," International Journal of Forecasting, Elsevier, vol. 32(2), pages 458-474.
    15. Stanislav Anatolyev, 2007. "Inference about predictive ability when there are many predictors," Working Papers w0096, New Economic School (NES).
    16. Pablo Pincheira Brown & Nicolás Hardy, 2024. "Correlation‐based tests of predictability," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 43(6), pages 1835-1858, September.
    17. Richard Ashley & Haichun Ye, 2012. "On the Granger causality between median inflation and price dispersion," Applied Economics, Taylor & Francis Journals, vol. 44(32), pages 4221-4238, November.
    18. Ferrara, Laurent & Marcellino, Massimiliano & Mogliani, Matteo, 2015. "Macroeconomic forecasting during the Great Recession: The return of non-linearity?," International Journal of Forecasting, Elsevier, vol. 31(3), pages 664-679.
    19. Raffaella Giacomini & Barbara Rossi, 2013. "Forecasting in macroeconomics," Chapters, in: Nigar Hashimzade & Michael A. Thornton (ed.), Handbook of Research Methods and Applications in Empirical Macroeconomics, chapter 17, pages 381-408, Edward Elgar Publishing.
    20. Busetti, Fabio & Marcucci, Juri, 2013. "Comparing forecast accuracy: A Monte Carlo investigation," International Journal of Forecasting, Elsevier, vol. 29(1), pages 13-27.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:arx:papers:2312.16099. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item, and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references, in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: arXiv administrators (email available below). General contact details of provider: http://arxiv.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.