
RGA: a unified measure of predictive accuracy

Authors

  • Paolo Giudici (University of Pavia)

  • Emanuela Raffinetti (University of Pavia)

Abstract

A key step in assessing statistical forecasts is the evaluation of their predictive accuracy. Recently, a new measure, the Rank Graduation Accuracy (RGA), based on the concordance between the ranks of the predicted values and the ranks of the actual values of a series of observations to be forecast, was proposed for assessing the quality of predictions. In this paper, we show that, from a classification perspective, when the response to be predicted is binary, the RGA coincides with both the AUROC and the Wilcoxon-Mann-Whitney statistic, and can be employed to evaluate the accuracy of probability forecasts. When the response to be predicted is real valued, the RGA can still be applied, unlike the AUROC and similarly to measures such as the RMSE. Unlike the RMSE, however, the RGA evaluates point predictions in terms of their ranks rather than their values, which improves robustness.
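
The two claims in the abstract can be illustrated numerically. The sketch below (Python, assuming numpy, scipy and scikit-learn are available) is not the authors' RGA formula: it checks that, for a binary response, the AUROC equals the normalised Wilcoxon-Mann-Whitney statistic, and it uses Spearman's rank correlation as a stand-in for the rank-based idea behind the RGA to show why a rank-based evaluation is more robust to outlying predictions than a value-based measure such as the RMSE. All variable names are illustrative.

    # Illustrative sketch only; Spearman's rho stands in for the rank-based
    # idea behind the RGA and is NOT the RGA formula from the paper.
    import numpy as np
    from scipy.stats import mannwhitneyu, spearmanr
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Binary response: AUROC equals the normalised Wilcoxon-Mann-Whitney U.
    y = rng.integers(0, 2, size=200)
    score = y + rng.normal(scale=1.0, size=200)           # noisy scores
    u, _ = mannwhitneyu(score[y == 1], score[y == 0])
    print(u / ((y == 1).sum() * (y == 0).sum()))          # U / (n1 * n0)
    print(roc_auc_score(y, score))                        # same value

    # Real-valued response: one wild prediction inflates the RMSE,
    # while a rank-based agreement measure barely moves.
    y_real = rng.normal(size=200)
    pred = y_real + rng.normal(scale=0.3, size=200)
    pred_bad = pred.copy()
    pred_bad[0] += 100.0                                  # single outlying prediction
    rmse = lambda a, b: float(np.sqrt(np.mean((a - b) ** 2)))
    print(rmse(y_real, pred), rmse(y_real, pred_bad))     # RMSE jumps
    rho, _ = spearmanr(y_real, pred)
    rho_bad, _ = spearmanr(y_real, pred_bad)
    print(rho, rho_bad)                                   # ranks nearly unchanged

In the binary block the two printed values coincide up to the handling of ties; in the real-valued block the RMSE increases sharply while the rank correlation is essentially unchanged.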

Suggested Citation

  • Paolo Giudici & Emanuela Raffinetti, 2025. "RGA: a unified measure of predictive accuracy," Advances in Data Analysis and Classification, Springer, vol. 19(1), pages 67-93, March.
  • Handle: RePEc:spr:advdac:v:19:y:2025:i:1:d:10.1007_s11634-023-00574-2
    DOI: 10.1007/s11634-023-00574-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11634-023-00574-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11634-023-00574-2?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a page where your library subscription can be used to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Juana-María Vivo & Manuel Franco & Donatella Vicari, 2018. "Rethinking an ROC partial area index for evaluating the classification performance at a high specificity range," Advances in Data Analysis and Classification, Springer, vol. 12(3), pages 683-704, September.
    2. Ikram Chaabane & Radhouane Guermazi & Mohamed Hammami, 2020. "Enhancing techniques for learning decision trees from imbalanced data," Advances in Data Analysis and Classification, Springer, vol. 14(3), pages 677-745, September.
    3. Johannes Bracher & Evan L Ray & Tilmann Gneiting & Nicholas G Reich, 2021. "Evaluating epidemic forecasts in an interval format," PLOS Computational Biology, Public Library of Science, vol. 17(2), pages 1-15, February.
    4. Tilmann Gneiting & Roopesh Ranjan, 2011. "Comparing Density Forecasts Using Threshold- and Quantile-Weighted Scoring Rules," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 29(3), pages 411-422, July.
    5. Emanuela Raffinetti, 2023. "A Rank Graduation Accuracy measure to mitigate Artificial Intelligence risks," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(2), pages 131-150, December.
    6. Tilmann Gneiting, 2011. "Making and Evaluating Point Forecasts," Journal of the American Statistical Association, American Statistical Association, vol. 106(494), pages 746-762.
    7. Edna Schechtman & Gideon Schechtman, 2019. "The relationship between Gini terminology and the ROC curve," METRON, Springer, vol. 77(3), pages 171-178, December.
    8. Francis X. Diebold & Roberto S. Mariano, 2002. "Comparing Predictive Accuracy," Journal of Business & Economic Statistics, American Statistical Association, vol. 20(1), pages 134-144, January.
    9. Tilmann Gneiting & Larissa Stanberry & Eric Grimit & Leonhard Held & Nicholas Johnson, 2008. "Rejoinder on: Assessing probabilistic forecasts of multivariate quantities, with an application to ensemble predictions of surface winds," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer, vol. 17(2), pages 256-264, August.
    10. Tilmann Gneiting & Larissa Stanberry & Eric Grimit & Leonhard Held & Nicholas Johnson, 2008. "Assessing probabilistic forecasts of multivariate quantities, with an application to ensemble predictions of surface winds," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer, vol. 17(2), pages 211-235, August.
    11. Tilmann Gneiting & Adrian E. Raftery, 2007. "Strictly Proper Scoring Rules, Prediction, and Estimation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 359-378, March.
    12. D. J. Hand & C. Anagnostopoulos, 2023. "Notes on the H-measure of classifier performance," Advances in Data Analysis and Classification, Springer, vol. 17(1), pages 109-124, March.
    13. Stanislav Vojíř & Tomáš Kliegr, 2020. "Editable machine learning models? A rule-based framework for user studies of explainability," Advances in Data Analysis and Classification, Springer, vol. 14(4), pages 785-799, December.
    14. Tae-Ho Kang & Ashish Sharma & Lucy Marshall, 2021. "Assessing Goodness of Fit for Verifying Probabilistic Forecasts," Forecasting, MDPI, vol. 3(4), pages 1-11, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Fabian Krüger & Sebastian Lerch & Thordis Thorarinsdottir & Tilmann Gneiting, 2021. "Predictive Inference Based on Markov Chain Monte Carlo Output," International Statistical Review, International Statistical Institute, vol. 89(2), pages 274-301, August.
    2. Gensler, André & Sick, Bernhard & Vogt, Stephan, 2018. "A review of uncertainty representations and metaverification of uncertainty assessment techniques for renewable energies," Renewable and Sustainable Energy Reviews, Elsevier, vol. 96(C), pages 352-379.
    3. Alexander, Carol & Han, Yang & Meng, Xiaochun, 2023. "Static and dynamic models for multivariate distribution forecasts: Proper scoring rule tests of factor-quantile versus multivariate GARCH models," International Journal of Forecasting, Elsevier, vol. 39(3), pages 1078-1096.
    4. Fabian Kruger & Hendrik Plett, 2022. "Prediction intervals for economic fixed-event forecasts," Papers 2210.13562, arXiv.org, revised Mar 2024.
    5. Hajo Holzmann & Matthias Eulert, 2014. "The role of the information set for forecasting - with applications to risk management," Papers 1404.7653, arXiv.org.
    6. Allen, Sam & Koh, Jonathan & Segers, Johan & Ziegel, Johanna, 2024. "Tail calibration of probabilistic forecasts," LIDAM Discussion Papers ISBA 2024018, Université catholique de Louvain, Institute of Statistics, Biostatistics and Actuarial Sciences (ISBA).
    7. Marc-Oliver Pohle, 2020. "The Murphy Decomposition and the Calibration-Resolution Principle: A New Perspective on Forecast Evaluation," Papers 2005.01835, arXiv.org.
    8. Taylor, James W. & Taylor, Kathryn S., 2023. "Combining probabilistic forecasts of COVID-19 mortality in the United States," European Journal of Operational Research, Elsevier, vol. 304(1), pages 25-41.
    9. Claudio Heinrich‐Mertsching & Thordis L. Thorarinsdottir & Peter Guttorp & Max Schneider, 2024. "Validation of point process predictions with proper scoring rules," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 51(4), pages 1533-1566, December.
    10. Petropoulos, Fotios & Apiletti, Daniele & Assimakopoulos, Vassilios & Babai, Mohamed Zied & Barrow, Devon K. & Ben Taieb, Souhaib & Bergmeir, Christoph & Bessa, Ricardo J. & Bijak, Jakub & Boylan, Joh, 2022. "Forecasting: theory and practice," International Journal of Forecasting, Elsevier, vol. 38(3), pages 705-871.
      • Fotios Petropoulos & Daniele Apiletti & Vassilios Assimakopoulos & Mohamed Zied Babai & Devon K. Barrow & Souhaib Ben Taieb & Christoph Bergmeir & Ricardo J. Bessa & Jakub Bijak & John E. Boylan & Jet, 2020. "Forecasting: theory and practice," Papers 2012.03854, arXiv.org, revised Jan 2022.
    11. Werner Ehm & Tilmann Gneiting & Alexander Jordan & Fabian Krüger, 2016. "Of quantiles and expectiles: consistent scoring functions, Choquet representations and forecast rankings," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(3), pages 505-562, June.
    12. Diks, Cees & Fang, Hao, 2020. "Comparing density forecasts in a risk management context," International Journal of Forecasting, Elsevier, vol. 36(2), pages 531-551.
    13. Tobias Fissler & Hajo Holzmann, 2022. "Measurability of functionals and of ideal point forecasts," Papers 2203.08635, arXiv.org.
    14. Jonas R. Brehmer & Tilmann Gneiting & Marcus Herrmann & Warner Marzocchi & Martin Schlather & Kirstin Strokorb, 2024. "Comparative evaluation of point process forecasts," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 76(1), pages 47-71, February.
    15. Chuliá, Helena & Garrón, Ignacio & Uribe, Jorge M., 2024. "Daily growth at risk: Financial or real drivers? The answer is not always the same," International Journal of Forecasting, Elsevier, vol. 40(2), pages 762-776.
    16. Kelly Trinh & Bo Zhang & Chenghan Hou, 2025. "Macroeconomic real‐time forecasts of univariate models with flexible error structures," Journal of Forecasting, John Wiley & Sons, Ltd., vol. 44(1), pages 59-78, January.
    17. Angelica Gianfreda & Francesco Ravazzolo & Luca Rossini, 2020. "Large Time-Varying Volatility Models for Electricity Prices," Working Papers No 05/2020, Centre for Applied Macro- and Petroleum economics (CAMP), BI Norwegian Business School.
    18. Magnus Reif, 2020. "Macroeconomics, Nonlinearities, and the Business Cycle," ifo Beiträge zur Wirtschaftsforschung, ifo Institute - Leibniz Institute for Economic Research at the University of Munich, number 87, May.
    19. Hauzenberger, Niko & Pfarrhofer, Michael & Rossini, Luca, 2025. "Sparse time-varying parameter VECMs with an application to modeling electricity prices," International Journal of Forecasting, Elsevier, vol. 41(1), pages 361-376.
    20. Florian Ziel & Kevin Berk, 2019. "Multivariate Forecasting Evaluation: On Sensitive and Strictly Proper Scoring Rules," Papers 1910.07325, arXiv.org.
