Printed from https://ideas.repec.org/a/eee/intfor/v38y2022i4p1426-1433.html

Forecasting with gradient boosted trees: augmentation, tuning, and cross-validation strategies

Author

Listed:
  • Lainder, A. David
  • Wolfinger, Russell D.

Abstract

Deep neural networks and gradient boosted tree models have swept across the field of machine learning over the past decade, producing across-the-board advances in performance. The ability of these methods to capture feature interactions and nonlinearities makes them exceptionally powerful and, at the same time, prone to overfitting, leakage, and a lack of generalization in domains with target non-stationarity and collinearity, such as time-series forecasting. We offer guidance to address these difficulties and provide a framework that maximizes the chances of predictions that generalize well and deliver state-of-the-art performance. The techniques we offer for cross-validation, augmentation, and parameter tuning have been used to win several major time-series forecasting competitions—including the M5 Forecasting Uncertainty competition and the Kaggle COVID19 Forecasting series—and, with the proper theoretical grounding, constitute the current best practices in time-series forecasting.
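As a rough illustration of the kind of time-ordered, leakage-aware cross-validation the abstract refers to, here is a minimal sketch using scikit-learn's TimeSeriesSplit with a gradient boosted tree regressor. The synthetic series, the lag choices, and the helper make_lag_features are assumptions made for this example only; they are not the authors' actual pipeline or data.

    # Minimal sketch (not the authors' pipeline): forward-chaining cross-validation
    # for a gradient boosted tree forecaster on a synthetic daily series.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import TimeSeriesSplit
    from sklearn.metrics import mean_absolute_error

    rng = np.random.default_rng(0)

    # Synthetic daily series: trend + weekly seasonality + noise.
    n = 500
    t = np.arange(n)
    y = 0.05 * t + 10 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 2, n)
    df = pd.DataFrame({"y": y}, index=pd.date_range("2020-01-01", periods=n, freq="D"))

    def make_lag_features(frame, lags=(1, 7, 14)):
        """Build lagged copies of the target so the model only sees past values."""
        out = frame.copy()
        for lag in lags:
            out[f"lag_{lag}"] = out["y"].shift(lag)
        out["dayofweek"] = out.index.dayofweek
        return out.dropna()

    data = make_lag_features(df)
    X = data.drop(columns="y").to_numpy()
    target = data["y"].to_numpy()

    # Forward-chaining splits: each validation fold lies strictly after its
    # training window, so the model is never trained on future observations.
    cv = TimeSeriesSplit(n_splits=5)
    scores = []
    for fold, (train_idx, valid_idx) in enumerate(cv.split(X)):
        model = GradientBoostingRegressor(
            n_estimators=300, learning_rate=0.05, max_depth=3, random_state=0
        )
        model.fit(X[train_idx], target[train_idx])
        pred = model.predict(X[valid_idx])
        scores.append(mean_absolute_error(target[valid_idx], pred))
        print(f"fold {fold}: MAE = {scores[-1]:.3f}")

    print(f"mean MAE across folds: {np.mean(scores):.3f}")

The point of the forward-chaining design is that every validation window follows its training window in time; a randomly shuffled K-fold would let the model train on observations that come after the ones it is scored on, which is exactly the kind of temporal leakage the abstract warns about.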

Suggested Citation

  • Lainder, A. David & Wolfinger, Russell D., 2022. "Forecasting with gradient boosted trees: augmentation, tuning, and cross-validation strategies," International Journal of Forecasting, Elsevier, vol. 38(4), pages 1426-1433.
  • Handle: RePEc:eee:intfor:v:38:y:2022:i:4:p:1426-1433
    DOI: 10.1016/j.ijforecast.2021.12.003

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0169207021002090
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.ijforecast.2021.12.003?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:intfor:v:38:y:2022:i:4:p:1426-1433. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows your profile to be linked to this item and also lets you accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/ijforecast.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.