Seasonal adjustment with 'bad' neural networks: a case of benign underfitting
This paper responds to reports, by a number of authors, of difficulties in using feedforward neural networks (NNs) in the presence of seasonality. Such difficulties appear to contradict the well-known property of 'universal approximation', but in fact arise because that property applies only to suitably smooth functions. We report an unexpected benefit of this limitation of NNs: although the network is unable to follow seasonal movements, it can identify an underlying trend, because it is forced to steer a middle course. We achieve such deseasonalisation through network pruning, which restricts the approximation capacity of the network by depriving it of connections. Although pruning is generally motivated by the search for the most parsimonious network, in our particular context it forces the network to ignore both seasonal movements and noise.
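The effect described above can be illustrated with a toy sketch (not the authors' code or data; all series, sizes and parameters below are invented for the demonstration): a deliberately under-parameterised feedforward network, trained on a series with a linear trend and a period-12 seasonal cycle, lacks the capacity to track the cycle and so settles on the smooth trend.

```python
import numpy as np

# Hypothetical illustration of "benign underfitting": a tiny 1-2-1
# feedforward network fitted to a trend + seasonal + noise series.
# Two tanh hidden units cannot follow a period-12 cycle, so the fit
# is forced to steer a middle course through the seasonal swings.

rng = np.random.default_rng(0)

n = 120                                   # ten "years" of monthly data
t = np.arange(n) / n                      # time index, scaled to [0, 1]
trend = 1.0 + 3.0 * t                     # smooth underlying trend
seasonal = 2.0 * np.sin(2 * np.pi * np.arange(n) / 12)
y = trend + seasonal + 0.3 * rng.standard_normal(n)

# Low-capacity network: y_hat = tanh(t * w1 + b1) @ w2 + b2
h = 2                                     # only two hidden units
w1 = rng.standard_normal(h); b1 = np.zeros(h)
w2 = 0.1 * rng.standard_normal(h); b2 = y.mean()

lr = 0.05
for _ in range(20000):                    # full-batch gradient descent
    z = np.outer(t, w1) + b1              # (n, h) pre-activations
    a = np.tanh(z)
    pred = a @ w2 + b2
    err = pred - y
    # Backpropagated mean-squared-error gradients
    g2 = a.T @ err / n
    gb2 = err.mean()
    da = np.outer(err, w2) * (1.0 - a**2) / n
    g1 = t @ da
    gb1 = da.sum(axis=0)
    w2 -= lr * g2; b2 -= lr * gb2
    w1 -= lr * g1; b1 -= lr * gb1

fit = np.tanh(np.outer(t, w1) + b1) @ w2 + b2
print("MSE of fit vs. trend :", np.mean((fit - trend) ** 2))
print("MSE of data vs. trend:", np.mean((y - trend) ** 2))
```

Because the network cannot represent the seasonal oscillation, its fitted curve lies closer to the true trend than the raw data do; the paper obtains the same capacity restriction through pruning rather than by fixing a small architecture in advance.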
Volume (Year): 2 (2010)
Issue (Month): 4
Pages: 335-350
Journal web page: http://www.inderscience.com/browse/index.php?journalID=286
Handle: RePEc:ids:injams:v:2:y:2010:i:4:p:335-350