Printed from https://ideas.repec.org/p/arx/papers/2505.21278.html

Conditional Method Confidence Set

Authors

Listed:
  • Lukas Bauer
  • Ekaterina Kazak

Abstract

This paper proposes the Conditional Method Confidence Set (CMCS), which selects the best subset of forecasting methods with equal predictive ability conditional on a specific economic regime. The test resembles the Model Confidence Set of Hansen et al. (2011) and is adapted for conditional forecast evaluation. We show the asymptotic validity of the proposed test and illustrate its properties in a simulation study. The procedure is particularly suitable for the stress-testing of financial risk models required by regulators. We showcase the empirical relevance of the CMCS in a stress-testing scenario for Expected Shortfall. The empirical evidence suggests that the CMCS can serve as a robust tool for forecast evaluation of market risk models across different economic regimes.
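The abstract describes an MCS-style elimination test applied to the subsample of observations belonging to a given economic regime. The sketch below illustrates that general idea only; it is not the authors' procedure, and the test statistic, moving-block bootstrap, and p-value construction are simplified assumptions modeled on Hansen et al. (2011).

```python
import numpy as np

def cmcs(losses, regime, alpha=0.10, n_boot=500, block=10, seed=0):
    """Illustrative MCS-style elimination restricted to one economic regime
    (a sketch of the CMCS idea, not the paper's exact algorithm).

    losses : (T, M) array of per-period losses for M forecasting methods
    regime : (T,) boolean array selecting the observations of the regime
    Returns the indices of the methods retained at level alpha.
    """
    rng = np.random.default_rng(seed)
    L = np.asarray(losses)[np.asarray(regime, dtype=bool)]  # regime subsample
    T = L.shape[0]
    surviving = list(range(L.shape[1]))
    p_max = 0.0
    while len(surviving) > 1:
        sub = L[:, surviving]
        # d[t, i]: loss of method i minus the average loss of its rivals
        d = sub - (sub.sum(axis=1, keepdims=True) - sub) / (sub.shape[1] - 1)
        dbar = d.mean(axis=0)
        # moving-block bootstrap of the loss differentials
        n_blocks = int(np.ceil(T / block))
        starts = rng.integers(0, T - block + 1, size=(n_boot, n_blocks))
        idx = (starts[:, :, None] + np.arange(block)).reshape(n_boot, -1)[:, :T]
        boot_means = d[idx].mean(axis=1)                 # (n_boot, m)
        se = boot_means.std(axis=0, ddof=1)
        t_obs = dbar / se
        t_boot = (boot_means - dbar) / se                # centered under the null
        stat = t_obs.max()
        p = (t_boot.max(axis=1) >= stat).mean()
        p_max = max(p_max, p)                            # MCS p-values are monotone
        if p_max >= alpha:
            break                                        # remaining set has equal ability
        surviving.pop(int(np.argmax(t_obs)))             # eliminate the worst method
    return surviving
```

A method whose losses are systematically larger within the chosen regime is eliminated, while methods with statistically indistinguishable regime-conditional losses survive together at level alpha.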

Suggested Citation

  • Lukas Bauer & Ekaterina Kazak, 2025. "Conditional Method Confidence Set," Papers 2505.21278, arXiv.org.
  • Handle: RePEc:arx:papers:2505.21278

    Download full text from publisher

    File URL: http://arxiv.org/pdf/2505.21278
    File Function: Latest version
    Download Restriction: no

    References listed on IDEAS

    1. Jia Li & Zhipeng Liao & Mengsi Gao, 2020. "Uniform nonparametric inference for time series using Stata," Stata Journal, StataCorp LLC, vol. 20(3), pages 706-720, September.
    2. Daniel Borup & Martin Thyrsgaard, 2017. "Statistical tests for equal predictive ability across multiple forecasting methods," CREATES Research Papers 2017-19, Department of Economics and Business Economics, Aarhus University.
    3. Raffaella Giacomini & Halbert White, 2006. "Tests of Conditional Predictive Ability," Econometrica, Econometric Society, vol. 74(6), pages 1545-1578, November.
    4. Ellis, Scott & Sharma, Satish & Brzeszczyński, Janusz, 2022. "Systemic risk measures and regulatory challenges," Journal of Financial Stability, Elsevier, vol. 61(C).
    5. Raffaella Giacomini & Barbara Rossi, 2010. "Forecast comparisons in unstable environments," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 25(4), pages 595-620.
    6. Daniel Borup & Jonas N. Eriksen & Mads M. Kjær & Martin Thyrsgaard, 2024. "Predicting Bond Return Predictability," Management Science, INFORMS, vol. 70(2), pages 931-951, February.
    7. Giovanni Barone‐Adesi & Kostas Giannopoulos & Les Vosper, 1999. "VaR without correlations for portfolios of derivative securities," Journal of Futures Markets, John Wiley & Sons, Ltd., vol. 19(5), pages 583-602, August.
8. White, Halbert, 1996. "Estimation, Inference and Specification Analysis," Cambridge Books, Cambridge University Press, number 9780521574464, September.
    9. Diebold, Francis X & Mariano, Roberto S, 2002. "Comparing Predictive Accuracy," Journal of Business & Economic Statistics, American Statistical Association, vol. 20(1), pages 134-144, January.
    10. Gneiting, Tilmann & Raftery, Adrian E., 2007. "Strictly Proper Scoring Rules, Prediction, and Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 359-378, March.
    11. Peter R. Hansen & Asger Lunde & James M. Nason, 2011. "The Model Confidence Set," Econometrica, Econometric Society, vol. 79(2), pages 453-497, March.
    12. Jia Li & Zhipeng Liao & Rogier Quaedvlieg, 2022. "Conditional Superior Predictive Ability," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 89(2), pages 843-875.
    13. Taylor, James W., 2020. "Forecast combinations for value at risk and expected shortfall," International Journal of Forecasting, Elsevier, vol. 36(2), pages 428-441.
    14. Andrew J. Patton, 2020. "Comparing Possibly Misspecified Forecasts," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 38(4), pages 796-809, October.
    15. Gourieroux, C. & Monfort, A., 2021. "Model risk management: Valuation and governance of pseudo-models," Econometrics and Statistics, Elsevier, vol. 17(C), pages 1-22.
    16. White, Halbert & Domowitz, Ian, 1984. "Nonlinear Regression with Dependent Observations," Econometrica, Econometric Society, vol. 52(1), pages 143-161, January.
    17. Li, Jia & Liao, Zhipeng, 2020. "Uniform nonparametric inference for time series," Journal of Econometrics, Elsevier, vol. 219(1), pages 38-51.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Daniel Borup & Jonas N. Eriksen & Mads M. Kjær & Martin Thyrsgaard, 2024. "Predicting Bond Return Predictability," Management Science, INFORMS, vol. 70(2), pages 931-951, February.
    2. Oh, Dong Hwan & Patton, Andrew J., 2024. "Better the devil you know: Improved forecasts from imperfect models," Journal of Econometrics, Elsevier, vol. 242(1).
    3. Dimitriadis, Timo & Schnaitmann, Julie, 2021. "Forecast encompassing tests for the expected shortfall," International Journal of Forecasting, Elsevier, vol. 37(2), pages 604-621.
    4. David T. Frazier & Donald S. Poskitt, 2025. "Sequential Scoring Rule Evaluation for Forecast Method Selection," Papers 2505.09090, arXiv.org.
    5. David I. Harvey & Stephen J. Leybourne & Yang Zu, 2024. "Tests for equal forecast accuracy under heteroskedasticity," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 39(5), pages 850-869, August.
    6. Lukas Bauer, 2025. "Evaluating financial tail risk forecasts: Testing Equal Predictive Ability," Papers 2505.23333, arXiv.org.
    7. Daniel Borup & Martin Thyrsgaard, 2017. "Statistical tests for equal predictive ability across multiple forecasting methods," CREATES Research Papers 2017-19, Department of Economics and Business Economics, Aarhus University.
    8. Tobias Fissler & Yannick Hoga, 2024. "How to Compare Copula Forecasts?," Papers 2410.04165, arXiv.org.
    9. Matei Demetrescu & Christoph Hanck & Robinson Kruse‐Becher, 2022. "Robust inference under time‐varying volatility: A real‐time evaluation of professional forecasters," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 37(5), pages 1010-1030, August.
    10. Emilio Zanetti Chini, 2013. "Generalizing smooth transition autoregressions," CREATES Research Papers 2013-32, Department of Economics and Business Economics, Aarhus University.
    11. Laura Garcia‐Jorcano & Alfonso Novales, 2021. "Volatility specifications versus probability distributions in VaR forecasting," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 40(2), pages 189-212, March.
    12. Zanetti Chini, Emilio, 2018. "Forecasting dynamically asymmetric fluctuations of the U.S. business cycle," International Journal of Forecasting, Elsevier, vol. 34(4), pages 711-732.
    13. Barbara Rossi, 2021. "Forecasting in the Presence of Instabilities: How We Know Whether Models Predict Well and How to Improve Them," Journal of Economic Literature, American Economic Association, vol. 59(4), pages 1135-1190, December.
    14. Michael W. McCracken, 2020. "Diverging Tests of Equal Predictive Ability," Econometrica, Econometric Society, vol. 88(4), pages 1753-1754, July.
    15. Fissler Tobias & Ziegel Johanna F., 2021. "On the elicitability of range value at risk," Statistics & Risk Modeling, De Gruyter, vol. 38(1-2), pages 25-46, January.
    16. Rafal Weron & Florian Ziel, 2018. "Electricity price forecasting," HSC Research Reports HSC/18/08, Hugo Steinhaus Center, Wroclaw University of Science and Technology.
17. Malte Knüppel & Fabian Krüger & Marc-Oliver Pohle, 2022. "Score-based calibration testing for multivariate forecast distributions," Papers 2211.16362, arXiv.org, revised Dec 2023.
    18. Jie Cheng, 2024. "Evaluating Density Forecasts Using Weighted Multivariate Scores in a Risk Management Context," Computational Economics, Springer;Society for Computational Economics, vol. 64(6), pages 3617-3643, December.
    19. Valentina Corradi & Sainan Jin & Norman R. Swanson, 2023. "Robust forecast superiority testing with an application to assessing pools of expert forecasters," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 38(4), pages 596-622, June.
    20. Diks, Cees & Panchenko, Valentyn & Sokolinskiy, Oleg & van Dijk, Dick, 2014. "Comparing the accuracy of multivariate density forecasts in selected regions of the copula support," Journal of Economic Dynamics and Control, Elsevier, vol. 48(C), pages 79-94.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

