IDEAS home Printed from https://ideas.repec.org/a/taf/jnlbes/v38y2020i4p796-809.html

Comparing Possibly Misspecified Forecasts

Author

Listed:
  • Andrew J. Patton

Abstract

Recent work has emphasized the importance of evaluating estimates of a statistical functional (such as a conditional mean, quantile, or distribution) using a loss function that is consistent for the functional of interest; infinitely many such loss functions exist. If forecasters all use correctly specified models free from estimation error, and if the information sets of competing forecasters are nested, then the ranking induced by a single consistent loss function is sufficient for the ranking by any consistent loss function. This article shows, via analytical results and realistic simulation-based analyses, that the presence of misspecified models, parameter estimation error, or nonnested information sets generally leads to sensitivity to the choice of (consistent) loss function. Thus, rather than merely specifying the target functional, which narrows the set of relevant loss functions only to the class of loss functions consistent for that functional, forecast consumers or survey designers should specify the single specific loss function that will be used to evaluate forecasts. An application to survey forecasts of U.S. inflation illustrates the results.
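The abstract's point can be illustrated with a small sketch (not taken from the article itself): two misspecified mean forecasts are compared under two loss functions that are both consistent for the mean. Squared error is the Bregman loss with phi(z) = z^2, and the exponential Bregman loss with phi(z) = (2/a^2) exp(a z) is another member of the same consistent class. The distribution, forecast values, and parameter a below are illustrative assumptions.

```python
import math

# True distribution of Y: Gaussian with mean MU and std SIGMA.
MU, SIGMA = 0.0, 1.0
# Two misspecified point forecasts of the mean (the truth is 0).
FCAST_A, FCAST_B = 0.2, -0.2

def expected_squared_error(x, mu=MU, sigma=SIGMA):
    # E[(Y - x)^2] = sigma^2 + (mu - x)^2
    return sigma**2 + (mu - x)**2

def expected_exp_bregman(x, a=1.0, mu=MU, sigma=SIGMA):
    # Bregman loss with phi(z) = (2/a^2) exp(a z):
    #   L(y, x) = (2/a^2)(e^{ay} - e^{ax}) - (2/a) e^{ax} (y - x)
    # Expected loss uses E[e^{aY}] = exp(a*mu + a^2 sigma^2 / 2) for Gaussian Y.
    m = math.exp(a * mu + 0.5 * a**2 * sigma**2)
    return (2 / a**2) * (m - math.exp(a * x)) - (2 / a) * math.exp(a * x) * (mu - x)

mse_a, mse_b = expected_squared_error(FCAST_A), expected_squared_error(FCAST_B)
eb_a, eb_b = expected_exp_bregman(FCAST_A), expected_exp_bregman(FCAST_B)

print(f"squared error:       A={mse_a:.4f}  B={mse_b:.4f}")  # exact tie: 1.0400 each
print(f"exponential Bregman: A={eb_a:.4f}  B={eb_b:.4f}")    # B strictly lower
```

Under squared error the two forecasts are exactly tied, yet the exponential Bregman loss (which penalizes over- and under-prediction asymmetrically) strictly prefers forecast B, so the ranking is not invariant across consistent losses once the forecasts are misspecified.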

Suggested Citation

  • Andrew J. Patton, 2020. "Comparing Possibly Misspecified Forecasts," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 38(4), pages 796-809, October.
  • Handle: RePEc:taf:jnlbes:v:38:y:2020:i:4:p:796-809
    DOI: 10.1080/07350015.2019.1585256

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1080/07350015.2019.1585256
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1080/07350015.2019.1585256?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Xenxo Vidal-Llana & Carlos Salort Sánchez & Vincenzo Coia & Montserrat Guillen, 2022. "Non-Crossing Dual Neural Network: Joint Value at Risk and Conditional Tail Expectation estimations with non-crossing conditions," IREA Working Papers 202215, University of Barcelona, Research Institute of Applied Economics, revised Oct 2022.
    2. Llorens-Terrazas, Jordi & Brownlees, Christian, 2023. "Projected Dynamic Conditional Correlations," International Journal of Forecasting, Elsevier, vol. 39(4), pages 1761-1776.
    3. Patrick Schmidt & Matthias Katzfuss & Tilmann Gneiting, 2021. "Interpretation of point forecasts with unknown directive," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 36(6), pages 728-743, September.
    4. Alexander Henzi & Johanna F Ziegel, 2022. "Valid sequential inference on probability forecast performance [A comparison of the ECMWF, MSC, and NCEP global ensemble prediction systems]," Biometrika, Biometrika Trust, vol. 109(3), pages 647-663.
    5. Fissler Tobias & Ziegel Johanna F., 2021. "On the elicitability of range value at risk," Statistics & Risk Modeling, De Gruyter, vol. 38(1-2), pages 25-46, January.
    6. Tobias Fissler & Jana Hlavinová & Birgit Rudloff, 2021. "Elicitability and identifiability of set-valued measures of systemic risk," Finance and Stochastics, Springer, vol. 25(1), pages 133-165, January.
    7. Denuit, Michel & Trufin, Julien, 2022. "Autocalibration by balance correction in nonlife insurance pricing," LIDAM Discussion Papers ISBA 2022041, Université catholique de Louvain, Institute of Statistics, Biostatistics and Actuarial Sciences (ISBA).
    8. Takaaki Koike & Cathy W. S. Chen & Edward M. H. Lin, 2024. "Forecasting and Backtesting Gradient Allocations of Expected Shortfall," Papers 2401.11701, arXiv.org.
    9. Alexander I. Jordan & Anja Mühlemann & Johanna F. Ziegel, 2022. "Characterizing the optimal solutions to the isotonic regression problem for identifiable functionals," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 74(3), pages 489-514, June.
    10. Cathy W. S. Chen & Takaaki Koike & Wei-Hsuan Shau, 2024. "Tail risk forecasting with semi-parametric regression models by incorporating overnight information," Papers 2402.07134, arXiv.org.
    11. Timo Dimitriadis & Andrew J. Patton & Patrick W. Schmidt, 2019. "Testing Forecast Rationality for Measures of Central Tendency," Papers 1910.12545, arXiv.org, revised Jun 2023.
    12. Charles F. Manski, 2021. "Econometrics for Decision Making: Building Foundations Sketched by Haavelmo and Wald," Econometrica, Econometric Society, vol. 89(6), pages 2827-2853, November.
    13. Lazar, Emese & Wang, Shixuan & Xue, Xiaohan, 2023. "Loss function-based change point detection in risk measures," European Journal of Operational Research, Elsevier, vol. 310(1), pages 415-431.
    14. Dong Hwan Oh & Andrew J. Patton, 2021. "Better the Devil You Know: Improved Forecasts from Imperfect Models," Finance and Economics Discussion Series 2021-071, Board of Governors of the Federal Reserve System (U.S.).
    15. Mucahit Aygun & Fabio Bellini & Roger J. A. Laeven, 2023. "Elicitability of Return Risk Measures," Papers 2302.13070, arXiv.org, revised Mar 2023.
    16. Tobias Fissler & Yannick Hoga, 2021. "Backtesting Systemic Risk Forecasts using Multi-Objective Elicitability," Papers 2104.10673, arXiv.org, revised Feb 2022.
    17. Boskabadi, Elahe, 2022. "Economic policy uncertainty and forecast bias in the survey of professional forecasters," MPRA Paper 115081, University Library of Munich, Germany.
    18. Valentina Corradi & Sainan Jin & Norman R. Swanson, 2023. "Robust forecast superiority testing with an application to assessing pools of expert forecasters," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 38(4), pages 596-622, June.
    19. Yen, Yu-Min & Yen, Tso-Jung, 2021. "Testing forecast accuracy of expectiles and quantiles with the extremal consistent loss functions," International Journal of Forecasting, Elsevier, vol. 37(2), pages 733-758.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:taf:jnlbes:v:38:y:2020:i:4:p:796-809. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Longhurst (email available below). General contact details of provider: http://www.tandfonline.com/UBES20.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.