Authors:
- Kristóf Németh
- Dániel Hadházi
Abstract
We apply artificial neural networks (ANNs) to nowcast quarterly GDP growth for the US economy. Using the monthly FRED‐MD database, we compare the nowcasting performance of five different ANN architectures: the multilayer perceptron (MLP), the one‐dimensional convolutional neural network (1D CNN), the Elman recurrent neural network (RNN), the long short‐term memory (LSTM) network, and the gated recurrent unit (GRU). The empirical analysis presents results from two distinctly different evaluation periods. The first (2012:Q1–2019:Q4) is characterized by balanced economic growth, while the second (2012:Q1–2024:Q2) also includes the COVID‐19 recession. During the first evaluation period, longer input sequences slightly improve nowcasting performance for some ANNs, but the best accuracy is still achieved with 8‐month‐long input sequences at the end of the nowcasting window. Results from the second test period highlight the role of long‐term memory even more clearly. The MLP, the 1D CNN, and the Elman RNN work best with 8‐month‐long input sequences at each step of the nowcasting window. The relatively weak performance of the gated RNNs also suggests that architectural features enabling long‐term memory do not result in more accurate nowcasts for GDP growth. The combined results indicate that the 1D CNN represents a "sweet spot" between the simple time‐agnostic MLP and the more complex (gated) RNNs. The network generates nearly as accurate nowcasts as the best competitor for the first test period, while it achieves the overall best accuracy during the second evaluation period. Consequently, for the first time in the literature, we propose the application of the 1D CNN for economic nowcasting.
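To make the 1D CNN setup described in the abstract concrete, the following is a minimal sketch of a forward pass: a temporal convolution over an 8‐month window of monthly indicators, global average pooling over time, and a linear head producing a scalar GDP‐growth nowcast. The layer sizes, weights, and feature count here are hypothetical illustrations, not the paper's actual specification.

```python
import numpy as np

rng = np.random.default_rng(0)

SEQ_LEN = 8        # 8-month input sequence, as in the paper
N_FEATURES = 4     # number of monthly indicators (hypothetical)
N_FILTERS = 3      # number of convolutional filters (hypothetical)
KERNEL = 3         # temporal kernel width (hypothetical)

def conv1d_forward(x, kernels, bias):
    """Valid 1D convolution over time with ReLU activation.

    x:       (seq_len, n_features) monthly indicator window
    kernels: (n_filters, kernel, n_features)
    bias:    (n_filters,)
    """
    seq_len = x.shape[0]
    n_filters, kernel, _ = kernels.shape
    out_len = seq_len - kernel + 1
    out = np.zeros((out_len, n_filters))
    for t in range(out_len):
        window = x[t:t + kernel]  # (kernel, n_features)
        # Sum over the kernel and feature axes for every filter at once.
        out[t] = np.tensordot(kernels, window, axes=([1, 2], [0, 1])) + bias
    return np.maximum(out, 0.0)

def nowcast(x, kernels, bias, w_out, b_out):
    """Conv -> global average pooling over time -> linear head -> scalar nowcast."""
    h = conv1d_forward(x, kernels, bias)  # (out_len, n_filters)
    pooled = h.mean(axis=0)               # (n_filters,)
    return float(pooled @ w_out + b_out)

# Random weights and one random 8-month window of standardized indicators.
x = rng.standard_normal((SEQ_LEN, N_FEATURES))
kernels = rng.standard_normal((N_FILTERS, KERNEL, N_FEATURES)) * 0.1
bias = np.zeros(N_FILTERS)
w_out = rng.standard_normal(N_FILTERS) * 0.1
b_out = 0.0

print(nowcast(x, kernels, bias, w_out, b_out))
```

The convolution's fixed temporal receptive field is what makes the 1D CNN a middle ground between the MLP, which treats the window as an unordered feature vector, and the gated RNNs, which maintain an explicit recurrent state across the sequence.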
Suggested Citation
Kristóf Németh & Dániel Hadházi, 2026.
"GDP Nowcasting With Artificial Neural Networks: How Much Does Long‐Term Memory Matter?,"
Journal of Forecasting, John Wiley & Sons, Ltd., vol. 45(3), pages 924-963, April.
Handle:
RePEc:wly:jforec:v:45:y:2026:i:3:p:924-963
DOI: 10.1002/for.70061