Printed from https://ideas.repec.org/p/pra/mprapa/127449.html

A Proposal for a Unified Forecast Accuracy Index (UFAI): Toward Multidimensional and Context-Aware Forecast Evaluation

Author

Listed:
  • Chellai, Fatih

Abstract

Forecast accuracy evaluation is a cornerstone in fields as diverse as finance, public health, energy, and meteorology. However, traditional reliance on single-error metrics—such as MAE, RMSE, or MAPE—offers only a fragmented view of a model’s performance, often obscuring critical dimensions like systematic bias, volatility, directional behavior, or shape fidelity. To overcome these limitations, this study proposes the Unified Forecast Accuracy Index (UFAI), a multidimensional and composite metric that consolidates several facets of forecasting quality into a single, interpretable score. UFAI integrates four normalized sub-indices—bias, variance, directional accuracy, and shape preservation—each capturing a distinct performance characteristic. The framework accommodates multiple weighting schemes: equal weighting for simplicity, expert-informed weighting to reflect domain-specific priorities, and data-driven weighting based on statistical principles such as Principal Component Analysis and entropy measures. This flexibility enables users to adapt the index to diverse forecasting objectives and application contexts. The article details the mathematical formulation of each sub-index, discusses the theoretical soundness and practical implications of different weighting strategies, and demonstrates the utility of UFAI through comparative model evaluations. Emphasis is placed on the index’s normalization, interpretability, robustness to outliers, and extensibility to future use cases such as multi-horizon and probabilistic forecasts. By offering a more integrated and context-aware assessment tool, the UFAI marks a significant advancement in forecast evaluation methodology, supporting more reliable model selection and ultimately enhancing decision-making in data-driven environments.
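To make the construction described above concrete, the following is a minimal sketch of a UFAI-style composite score. All names and formulas here (`ufai`, the `1/(1+x)` normalizations, the correlation-based shape proxy, equal weights) are illustrative assumptions, not the paper's actual definitions, which should be taken from the full text.

```python
# Illustrative UFAI-style composite score. All normalizations below are
# hypothetical stand-ins; the paper defines the actual sub-indices.
from statistics import mean, pstdev

def _pearson(x, y):
    """Pearson correlation, used here as a crude proxy for shape fidelity."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def _directional_accuracy(actual, forecast):
    """Share of periods where the forecast change has the same sign as the actual change."""
    hits = sum(
        1
        for i in range(1, len(actual))
        if (actual[i] - actual[i - 1]) * (forecast[i] - forecast[i - 1]) > 0
    )
    return hits / (len(actual) - 1)

def ufai(actual, forecast, weights=(0.25, 0.25, 0.25, 0.25)):
    """Weighted combination of four sub-indices, each mapped to [0, 1] (1 = best)."""
    errors = [f - a for a, f in zip(actual, forecast)]
    scale = pstdev(actual) or 1.0  # guard against a constant series

    bias_idx = 1.0 / (1.0 + abs(mean(errors)) / scale)   # penalize systematic bias
    var_idx = 1.0 / (1.0 + pstdev(errors) / scale)       # penalize error volatility
    dir_idx = _directional_accuracy(actual, forecast)    # reward correct directions
    shape_idx = (_pearson(actual, forecast) + 1) / 2     # map correlation [-1, 1] to [0, 1]

    return sum(w * s for w, s in zip(weights, (bias_idx, var_idx, dir_idx, shape_idx)))
```

The equal-weight default corresponds to the abstract's "equal weighting for simplicity" option; the expert-informed or PCA/entropy-based schemes it mentions would simply supply a different `weights` tuple.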

Suggested Citation

  • Chellai, Fatih, 2025. "A Proposal for a Unified Forecast Accuracy Index (UFAI): Toward Multidimensional and Context-Aware Forecast Evaluation," MPRA Paper 127449, University Library of Munich, Germany.
  • Handle: RePEc:pra:mprapa:127449

    Download full text from publisher

    File URL: https://mpra.ub.uni-muenchen.de/127449/1/MPRA_paper_127449.pdf
    File Function: original version
    Download Restriction: no
    ---><---

    References listed on IDEAS

    1. Pesaran, M Hashem & Timmermann, Allan, 1992. "A Simple Nonparametric Test of Predictive Performance," Journal of Business & Economic Statistics, American Statistical Association, vol. 10(4), pages 561-565, October.
    2. Hyndman, Rob J. & Koehler, Anne B., 2006. "Another look at measures of forecast accuracy," International Journal of Forecasting, Elsevier, vol. 22(4), pages 679-688.
    3. Makridakis, Spyros & Hibon, Michele, 2000. "The M3-Competition: results, conclusions and implications," International Journal of Forecasting, Elsevier, vol. 16(4), pages 451-476.
    4. Spyros Makridakis & Evangelos Spiliotis & Vassilios Assimakopoulos, 2018. "Statistical and Machine Learning forecasting methods: Concerns and ways forward," PLOS ONE, Public Library of Science, vol. 13(3), pages 1-26, March.
    5. Gneiting, Tilmann & Raftery, Adrian E., 2007. "Strictly Proper Scoring Rules, Prediction, and Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 359-378, March.
    6. James H. Stock & Mark W. Watson, 2001. "Vector Autoregressions," Journal of Economic Perspectives, American Economic Association, vol. 15(4), pages 101-115, Fall.
    7. Armstrong, J. Scott & Collopy, Fred, 1992. "Error measures for generalizing about forecasting methods: Empirical comparisons," International Journal of Forecasting, Elsevier, vol. 8(1), pages 69-80, June.
    8. Makridakis, Spyros, 1993. "Accuracy measures: theoretical and practical concerns," International Journal of Forecasting, Elsevier, vol. 9(4), pages 527-529, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Makridakis, Spyros & Spiliotis, Evangelos & Assimakopoulos, Vassilios, 2020. "The M4 Competition: 100,000 time series and 61 forecasting methods," International Journal of Forecasting, Elsevier, vol. 36(1), pages 54-74.
    2. Jennifer L. Castle & Jurgen A. Doornik & David F. Hendry, 2021. "Forecasting Principles from Experience with Forecasting Competitions," Forecasting, MDPI, vol. 3(1), pages 1-28, February.
    3. Jennifer L. Castle & Jurgen A. Doornik & David Hendry, 2019. "Some forecasting principles from the M4 competition," Economics Papers 2019-W01, Economics Group, Nuffield College, University of Oxford.
    4. Blaskowitz, Oliver & Herwartz, Helmut, 2011. "On economic evaluation of directional forecasts," International Journal of Forecasting, Elsevier, vol. 27(4), pages 1058-1065, October.
    5. Nicholas G. Reich & Justin Lessler & Krzysztof Sakrejda & Stephen A. Lauer & Sopon Iamsirithaworn & Derek A. T. Cummings, 2016. "Case Study in Evaluating Time Series Prediction Models Using the Relative Mean Absolute Error," The American Statistician, Taylor & Francis Journals, vol. 70(3), pages 285-292, July.
    6. Makridakis, Spyros & Hyndman, Rob J. & Petropoulos, Fotios, 2020. "Forecasting in social settings: The state of the art," International Journal of Forecasting, Elsevier, vol. 36(1), pages 15-28.
    7. George Athanasopoulos & Nikolaos Kourentzes, 2021. "On the Evaluation of Hierarchical Forecasts," Monash Econometrics and Business Statistics Working Papers 10/21, Monash University, Department of Econometrics and Business Statistics.
    8. Semenoglou, Artemios-Anargyros & Spiliotis, Evangelos & Makridakis, Spyros & Assimakopoulos, Vassilios, 2021. "Investigating the accuracy of cross-learning time series forecasting methods," International Journal of Forecasting, Elsevier, vol. 37(3), pages 1072-1084.
    9. George Athanasopoulos & Nikolaos Kourentzes, 2020. "On the Evaluation of Hierarchical Forecasts," Monash Econometrics and Business Statistics Working Papers 2/20, Monash University, Department of Econometrics and Business Statistics.
    10. Spiliotis, Evangelos & Nikolopoulos, Konstantinos & Assimakopoulos, Vassilios, 2019. "Tales from tails: On the empirical distributions of forecasting errors and their implication to risk," International Journal of Forecasting, Elsevier, vol. 35(2), pages 687-698.
    11. Wen, Xin & Jaxa-Rozen, Marc & Trutnevyte, Evelina, 2022. "Accuracy indicators for evaluating retrospective performance of energy system models," Applied Energy, Elsevier, vol. 325(C).
    12. Makridakis, Spyros & Spiliotis, Evangelos & Assimakopoulos, Vassilios, 2022. "M5 accuracy competition: Results, findings, and conclusions," International Journal of Forecasting, Elsevier, vol. 38(4), pages 1346-1364.
    13. Athanasopoulos, George & Kourentzes, Nikolaos, 2023. "On the evaluation of hierarchical forecasts," International Journal of Forecasting, Elsevier, vol. 39(4), pages 1502-1511.
    14. Hewamalage, Hansika & Bergmeir, Christoph & Bandara, Kasun, 2021. "Recurrent Neural Networks for Time Series Forecasting: Current status and future directions," International Journal of Forecasting, Elsevier, vol. 37(1), pages 388-427.
    15. Kourentzes, Nikolaos & Athanasopoulos, George, 2021. "Elucidate structure in intermittent demand series," European Journal of Operational Research, Elsevier, vol. 288(1), pages 141-152.
    16. Alysha M De Livera, 2010. "Automatic forecasting with a modified exponential smoothing state space framework," Monash Econometrics and Business Statistics Working Papers 10/10, Monash University, Department of Econometrics and Business Statistics.
    17. Philippe St-Aubin & Bruno Agard, 2022. "Precision and Reliability of Forecasts Performance Metrics," Forecasting, MDPI, vol. 4(4), pages 1-22, October.
    18. Li, Li & Kang, Yanfei & Li, Feng, 2023. "Bayesian forecast combination using time-varying features," International Journal of Forecasting, Elsevier, vol. 39(3), pages 1287-1302.
    19. Dean W. Wichern & Benito E. Flores, 2005. "Evaluating forecasts: a look at aggregate bias and accuracy measures," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 24(6), pages 433-451.
    20. Snyder, Ralph D. & Ord, J. Keith & Beaumont, Adrian, 2012. "Forecasting the intermittent demand for slow-moving inventories: A modelling approach," International Journal of Forecasting, Elsevier, vol. 28(2), pages 485-496.

    More about this item


    JEL classification:

    • C1 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General
    • C2 - Mathematical and Quantitative Methods - - Single Equation Models; Single Variables
    • C4 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods: Special Topics

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:pra:mprapa:127449. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Joachim Winter (email available below). General contact details of provider: https://edirc.repec.org/data/vfmunde.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.