
Forecasting Inflation in Russia Using Gradient Boosting and Neural Networks

Author

Listed:
  • Urmat Dzhunkeev

    (RANEPA; Lomonosov Moscow State University)

Abstract

The aim of this paper is to estimate the efficiency of forecasting inflation in Russia using machine learning methods such as gradient boosting algorithms and neural networks. This is the first paper in which long short-term memory (LSTM) and gated recurrent unit (GRU) models are used to forecast inflation in Russia. In addition, I test modified versions of gradient boosting such as LightGBM and CatBoost. With a sample of lagged inflation values, the most accurate forecasts are obtained using convolutional neural networks (CNNs) and fully connected neural networks (FCNNs), and, when forecasting over a twelve-month horizon, using the LSTM model, an advantage attributable to its sequential processing of information and its gating mechanism. When additional macroeconomic factors are taken into account, FCNNs and the Sklearn gradient boosting model demonstrate a predictive advantage. As per the Shapley decomposition, the most informative predictors for forecasting Russian inflation are oil and natural gas prices, inflation in the euro area and the United States, retail trade turnover dynamics, and unemployment growth.
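The "sample of lagged inflation values" setup described in the abstract can be illustrated with a minimal sketch: a monthly series is converted into a supervised-learning dataset, where each row holds the last few observed values and the target is the value a given horizon ahead. The series and parameter names below are illustrative, not taken from the paper.

```python
def make_lag_features(series, n_lags, horizon=1):
    """Return (X, y): each row of X holds the n_lags most recent values;
    y is the series value `horizon` steps ahead of that window."""
    X, y = [], []
    for t in range(n_lags, len(series) - horizon + 1):
        X.append(series[t - n_lags:t])   # lagged inflation values
        y.append(series[t + horizon - 1])  # future inflation to predict
    return X, y

# Illustrative monthly inflation rates (hypothetical numbers)
inflation = [0.4, 0.6, 0.5, 0.7, 0.9, 0.8, 0.6, 0.5]
X, y = make_lag_features(inflation, n_lags=3, horizon=1)
print(len(X), X[0], y[0])  # 5 [0.4, 0.6, 0.5] 0.7
```

Any of the models compared in the paper (gradient boosting, FCNN, LSTM) could then be fit on such (X, y) pairs; for the twelve-month horizon discussed in the abstract, one would set `horizon=12`.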

Suggested Citation

  • Urmat Dzhunkeev, 2024. "Forecasting Inflation in Russia Using Gradient Boosting and Neural Networks," Russian Journal of Money and Finance, Bank of Russia, vol. 83(1), pages 53-76, March.
  • Handle: RePEc:bkr:journl:v:83:y:2024:i:1:p:53-76

    Download full text from publisher

    File URL: https://rjmf.econs.online/upload/iblock/05d/4sg4s9cgcnnenqs310mes07fietsztn2/Forecasting-Inflation-in-Russia-Using-Gradient-Boosting-and-Neural-Networks.pdf
    Download Restriction: no

    More about this item

    Keywords

    inflation forecasting; machine learning; gradient boosting; neural networks; Shapley value;

    JEL classification:

    • C52 - Mathematical and Quantitative Methods - - Econometric Modeling - - - Model Evaluation, Validation, and Selection
    • C53 - Mathematical and Quantitative Methods - - Econometric Modeling - - - Forecasting and Prediction Models; Simulation Methods
    • E37 - Macroeconomics and Monetary Economics - - Prices, Business Fluctuations, and Cycles - - - Forecasting and Simulation: Models and Applications


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bkr:journl:v:83:y:2024:i:1:p:53-76. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Olga Kuvshinova (email available below). General contact details of provider: https://edirc.repec.org/data/cbrgvru.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.