Printed from https://ideas.repec.org/p/not/notgts/17-04.html

A bootstrap stationarity test for predictive regression invalidity

Author

Listed:
  • Iliyan Georgiev
  • David I. Harvey
  • Stephen J. Leybourne
  • A. M. Robert Taylor

Abstract

In order for predictive regression tests to deliver asymptotically valid inference, account has to be taken of the degree of persistence of the predictors under test. There is also a maintained assumption that the predictability of the variable of interest is purely attributable to the predictors under test. Violation of this assumption through the omission of relevant persistent predictors renders the predictive regression invalid, with the result that both the finite sample and asymptotic size of the predictability tests can be significantly inflated, with the potential therefore to spuriously indicate predictability. In response, we propose a predictive regression invalidity test based on a stationarity testing approach. To allow for an unknown degree of persistence in the putative predictors, and for heteroskedasticity in the data, we implement our proposed test using a fixed regressor wild bootstrap procedure. We demonstrate that the asymptotic distribution of the bootstrap statistic, conditional on the data, is the same (to first-order) as the asymptotic null distribution of the statistic computed on the original data, conditional on the predictor. This corrects a long-standing error in the bootstrap literature whereby it is incorrectly argued that, for strongly persistent regressors, the validity of the fixed regressor bootstrap obtains through equivalence to an unconditional limit distribution. Our bootstrap results are therefore of interest in their own right and are likely to have important applications beyond the present context. An illustration is given by re-examining the results relating to US stock return data in Campbell and Yogo (2006).
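To make the mechanics concrete, the fixed regressor wild bootstrap described in the abstract can be sketched in simplified form: fit the predictive regression, compute a KPSS-type stationarity statistic on the residuals, then repeatedly regenerate the dependent variable with Rademacher-weighted residuals while holding the regressors fixed. This is an illustrative sketch only, not the authors' exact procedure: the statistic below omits the long-run variance kernel correction, and the function and variable names (`kpss_stat`, `fixed_regressor_wild_bootstrap`) are hypothetical.

```python
import numpy as np

def kpss_stat(e):
    """Simplified KPSS-type statistic: scaled sum of squared partial sums
    of the demeaned residuals (no long-run variance kernel correction)."""
    T = len(e)
    u = e - e.mean()
    S = np.cumsum(u)
    sigma2 = (u @ u) / T
    return (S @ S) / (T**2 * sigma2)

def fixed_regressor_wild_bootstrap(y, x_lag, B=999, seed=0):
    """Fixed regressor wild bootstrap p-value for a stationarity-type
    statistic on predictive regression residuals (illustrative sketch).

    y     : dependent variable, length T
    x_lag : lagged putative predictor, length T (held fixed across draws)
    B     : number of bootstrap replications
    """
    rng = np.random.default_rng(seed)
    T = len(y)
    X = np.column_stack([np.ones(T), x_lag])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    stat = kpss_stat(e)
    boot = np.empty(B)
    for b in range(B):
        w = rng.choice([-1.0, 1.0], size=T)   # Rademacher wild weights
        y_star = X @ beta + w * e             # regressors held fixed
        beta_b, *_ = np.linalg.lstsq(X, y_star, rcond=None)
        boot[b] = kpss_stat(y_star - X @ beta_b)
    pval = (1 + np.sum(boot >= stat)) / (B + 1)
    return stat, pval
```

Because the weights only flip residual signs, the bootstrap conditions on the observed predictor path, which is what allows the method to accommodate an unknown degree of persistence and heteroskedasticity.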

Suggested Citation

  • Iliyan Georgiev & David I. Harvey & Stephen J. Leybourne & A. M. Robert Taylor, 2017. "A bootstrap stationarity test for predictive regression invalidity," Discussion Papers 17/04, University of Nottingham, Granger Centre for Time Series Econometrics.
  • Handle: RePEc:not:notgts:17/04

    Download full text from publisher

    File URL: https://www.nottingham.ac.uk/research/groups/grangercentre/documents/17-04.pdf
    Download Restriction: no

    Citations
    Cited by:

    1. Giuseppe Cavaliere & Iliyan Georgiev, 2020. "Inference Under Random Limit Bootstrap Measures," Econometrica, Econometric Society, vol. 88(6), pages 2547-2574, November.
    2. Yang, Bingduo & Long, Wei & Yang, Zihui, 2022. "Testing predictability of stock returns under possible bubbles," Journal of Empirical Finance, Elsevier, vol. 68(C), pages 246-260.
    3. Xiaohui Liu & Yuzi Liu & Yao Rao & Fucai Lu, 2021. "A Unified test for the Intercept of a Predictive Regression Model," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 83(2), pages 571-588, April.
    4. Christis Katsouris, 2023. "Predictability Tests Robust against Parameter Instability," Papers 2307.15151, arXiv.org.
    5. Demetrescu, Matei & Rodrigues, Paulo M.M., 2022. "Residual-augmented IVX predictive regression," Journal of Econometrics, Elsevier, vol. 227(2), pages 429-460.
    6. Georgiev, Iliyan & Harvey, David I. & Leybourne, Stephen J. & Taylor, A.M. Robert, 2018. "Testing for parameter instability in predictive regression models," Journal of Econometrics, Elsevier, vol. 204(1), pages 101-118.
    7. Demetrescu, Matei & Rodrigues, Paulo M.M. & Taylor, A.M. Robert, 2023. "Transformed regression-based long-horizon predictability tests," Journal of Econometrics, Elsevier, vol. 237(2).
    8. Demetrescu, Matei & Georgiev, Iliyan & Rodrigues, Paulo M.M. & Taylor, A.M. Robert, 2022. "Testing for episodic predictability in stock returns," Journal of Econometrics, Elsevier, vol. 227(1), pages 85-113.
    9. Fukang Zhu & Mengya Liu & Shiqing Ling & Zongwu Cai, 2020. "Testing for Structural Change of Predictive Regression Model to Threshold Predictive Regression Model," WORKING PAPERS SERIES IN THEORETICAL AND APPLIED ECONOMICS 202021, University of Kansas, Department of Economics, revised Dec 2020.

    More about this item

    Keywords

    Predictive regression; Granger causality; persistence; stationarity test; fixed regressor wild bootstrap; conditional distribution;



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.