
How good are out of sample forecasting Tests on DSGE models?

Author

Listed:

  • Patrick Minford
  • Yongdeng Xu
  • Peng Zhou
Abstract

Out-of-sample forecasting tests of DSGE models against time-series benchmarks such as an unrestricted VAR are increasingly used to check (a) the specification and (b) the forecasting capacity of these models. We carry out a Monte Carlo experiment on a widely used DSGE model to investigate the power of these tests. We find that in specification testing they have weak power relative to an in-sample indirect inference test; this implies that a DSGE model may be badly mis-specified and still improve on forecasts from an unrestricted VAR. In testing forecasting capacity they also have quite weak power, particularly in the left-hand tail. By contrast, a model that passes an indirect inference test of specification will almost certainly also improve on VAR forecasts.
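To make the kind of Monte Carlo power experiment described above concrete, the sketch below (Python, not the authors' code) uses deliberately simple stand-ins: data are generated from a known AR(2) process, a mis-specified AR(1) plays the role of a badly specified structural model, an unrestricted AR(2) plays the role of the VAR benchmark, and a Diebold-Mariano-type statistic on squared one-step forecast errors is computed in each replication. The rejection frequency across replications estimates the test's power; every sample size, coefficient and critical value here is an illustrative assumption rather than a value taken from the paper.

    # A minimal, self-contained sketch of an out-of-sample power experiment.
    # The AR models, sample sizes and coefficients below are illustrative
    # assumptions; they are NOT the paper's DSGE/VAR setup.
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_ar2(T, phi1=0.5, phi2=0.3, sigma=1.0):
        """Draw T observations from the 'true' AR(2) data-generating process."""
        y = np.zeros(T + 50)                      # 50 extra points as burn-in
        for t in range(2, len(y)):
            y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + sigma * rng.standard_normal()
        return y[50:]

    def ols_ar(y, p):
        """OLS estimate of an AR(p) without intercept (kept minimal on purpose)."""
        Y = y[p:]
        X = np.column_stack([y[p - j:len(y) - j] for j in range(1, p + 1)])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return beta

    def one_step_errors(y, p, R):
        """Expanding-window one-step-ahead forecast errors from origin R onward."""
        errors = []
        for t in range(R, len(y)):
            beta = ols_ar(y[:t], p)
            yhat = sum(beta[j] * y[t - 1 - j] for j in range(p))
            errors.append(y[t] - yhat)
        return np.array(errors)

    def dm_statistic(e_model, e_benchmark):
        """Diebold-Mariano statistic on the squared-error loss differential.
        (With nested models its null distribution is non-standard; the
        Clark-West adjustment refines this in practice.)"""
        d = e_model ** 2 - e_benchmark ** 2
        return d.mean() / np.sqrt(d.var(ddof=1) / len(d))

    def power_experiment(n_rep=200, T=200, R=150, crit=1.645):
        """Share of replications in which the test detects that the
        mis-specified AR(1) forecasts worse than the unrestricted AR(2)."""
        rejections = 0
        for _ in range(n_rep):
            y = simulate_ar2(T)
            e_bad = one_step_errors(y, p=1, R=R)   # mis-specified 'structural' model
            e_var = one_step_errors(y, p=2, R=R)   # unrestricted benchmark
            if dm_statistic(e_bad, e_var) > crit:  # one-sided 5% test
                rejections += 1
        return rejections / n_rep

    if __name__ == "__main__":
        print(f"Estimated power of the out-of-sample test: {power_experiment():.2f}")

For nested comparisons such as this one, the Clark and West (2007) adjustment listed in the references below is the usual refinement of the plain Diebold-Mariano statistic; the paper's own experiment works with a full DSGE model and an unrestricted VAR rather than these toy univariate models.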

Suggested Citation

  • Minford, Patrick & Xu, Yongdeng & Zhou, Peng, 2014. "How good are out of sample forecasting Tests on DSGE models?," Cardiff Economics Working Papers E2014/11, Cardiff University, Cardiff Business School, Economics Section.
  • Handle: RePEc:cdf:wpaper:2014/11

    Download full text from publisher

    File URL: http://carbsecon.com/wp/E2014_11.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Kenneth S. Rogoff & Vania Stavrakeva, 2008. "The Continuing Puzzle of Short Horizon Exchange Rate Forecasting," NBER Working Papers 14071, National Bureau of Economic Research, Inc.
    2. Frank Smets & Rafael Wouters, 2007. "Shocks and Frictions in US Business Cycles: A Bayesian DSGE Approach," American Economic Review, American Economic Association, vol. 97(3), pages 586-606, June.
    3. Ince, Onur, 2014. "Forecasting exchange rates out-of-sample with panel methods and real-time data," Journal of International Money and Finance, Elsevier, vol. 43(C), pages 1-18.
    4. Le, Vo Phuong Mai & Meenagh, David & Minford, Patrick, 2012. "What causes banking crises? An empirical investigation," Cardiff Economics Working Papers E2012/14, Cardiff University, Cardiff Business School, Economics Section, revised Apr 2013.
    5. Clark, Todd E. & West, Kenneth D., 2006. "Using out-of-sample mean squared prediction errors to test the martingale difference hypothesis," Journal of Econometrics, Elsevier, vol. 135(1-2), pages 155-186.
    6. West, Kenneth D, 1996. "Asymptotic Inference about Predictive Ability," Econometrica, Econometric Society, vol. 64(5), pages 1067-1084, September.
    7. Le, Vo Phuong Mai & Meenagh, David & Minford, Patrick & Wickens, Michael, 2012. "Testing DSGE models by Indirect inference and other methods: some Monte Carlo experiments," Cardiff Economics Working Papers E2012/15, Cardiff University, Cardiff Business School, Economics Section.
    8. Christoffel, Kai & Warne, Anders & Coenen, Günter, 2010. "Forecasting with DSGE models," Working Paper Series 1185, European Central Bank.
    9. Raffaella Giacomini & Barbara Rossi, 2010. "Forecast comparisons in unstable environments," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 25(4), pages 595-620.
    10. Michael Wickens, 2014. "How Useful are DSGE Macroeconomic Models for Forecasting?," Open Economies Review, Springer, vol. 25(1), pages 171-193, February.
    11. Michael P. Clements & David F. Hendry, 2005. "Evaluating a Model by Forecast Performance," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 67(s1), pages 931-956, December.
    12. Le, Vo Phuong Mai & Meenagh, David & Minford, Patrick & Wickens, Michael, 2011. "How much nominal rigidity is there in the US economy? Testing a new Keynesian DSGE model using indirect inference," Journal of Economic Dynamics and Control, Elsevier, vol. 35(12), pages 2078-2104.
    13. Diebold, Francis X & Mariano, Roberto S, 2002. "Comparing Predictive Accuracy," Journal of Business & Economic Statistics, American Statistical Association, vol. 20(1), pages 134-144, January.
    14. Clark, Todd E. & West, Kenneth D., 2007. "Approximately normal tests for equal predictive accuracy in nested models," Journal of Econometrics, Elsevier, vol. 138(1), pages 291-311, May.
    15. Gürkaynak, Refet S. & Kisacikoglu, Burçin & Rossi, Barbara, 2013. "Do DSGE Models Forecast More Accurately Out-of-Sample than VAR Models?," CEPR Discussion Papers 9576, C.E.P.R. Discussion Papers.
    16. Rochelle M. Edge & Michael T. Kiley & Jean-Philippe Laforte, 2010. "A comparison of forecast performance between Federal Reserve staff forecasts, simple reduced-form models, and a DSGE model," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 25(4), pages 720-754.
    17. Rochelle M. Edge & Refet S. Gurkaynak, 2010. "How Useful Are Estimated DSGE Model Forecasts for Central Bankers?," Brookings Papers on Economic Activity, Economic Studies Program, The Brookings Institution, vol. 41(2 (Fall)), pages 209-259.
    18. Michel Juillard, 2001. "DYNARE: A program for the simulation of rational expectation models," Computing in Economics and Finance 2001 213, Society for Computational Economics.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Minford, Patrick & Wang, Yi & Zhou, Peng, 2017. "Resolving the Public Sector Wage Premium Puzzle by Indirect Inference," Cardiff Economics Working Papers E2017/13, Cardiff University, Cardiff Business School, Economics Section.
    2. Meenagh, David & Minford, Patrick & Wickens, Michael & Xu, Yongdeng, 2018. "Testing DSGE Models by indirect inference: a survey of recent findings," Cardiff Economics Working Papers E2018/14, Cardiff University, Cardiff Business School, Economics Section.
    3. Loberto, Michele & Perricone, Chiara, 2017. "Does trend inflation make a difference?," Economic Modelling, Elsevier, vol. 61(C), pages 351-375.

    More about this item

    Keywords

Out of sample forecasts; DSGE; VAR; specification tests; indirect inference; forecast performance

    JEL classification:

    • E10 - Macroeconomics and Monetary Economics - - General Aggregative Models - - - General
    • E17 - Macroeconomics and Monetary Economics - - General Aggregative Models - - - Forecasting and Simulation: Models and Applications

