An Evolutionary Bootstrap Approach to Neural Network Pruning and Generalization
This paper combines techniques drawn from the literature on evolutionary optimization algorithms with bootstrap-based statistical tests. Bootstrapping is used as a general framework for estimating objectives out of sample by redrawing subsets from a training sample. Evolution is used to search the large space of potential network architectures. The combination of these two methods yields a network estimation and selection procedure that finds parsimonious network structures that generalize well. The bootstrap methodology also allows for objective functions other than the usual least squares, since it can estimate the in-sample bias of any such function. Examples are given for forecasting chaotic time series contaminated with noise.
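The procedure described in the abstract can be illustrated with a minimal sketch: an evolutionary search over binary pruning masks, where each candidate architecture is scored by a bootstrap estimate of out-of-sample error (fit on a redrawn sample, evaluate on the held-out points). This is not the authors' implementation; the toy noisy logistic-map data, the use of polynomial terms as stand-in "units", and all function names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy chaotic series (logistic map), one-step-ahead prediction.
def logistic_series(n, x0=0.3, r=3.9):
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return np.array(xs)

series = logistic_series(300) + rng.normal(0.0, 0.02, 300)
# Candidate "units": powers 0..5 of the lagged value; pruning = choosing a subset.
X = np.column_stack([series[:-1] ** p for p in range(6)])
y = series[1:]

def bootstrap_oob_mse(mask, B=20):
    """Bootstrap estimate of out-of-sample MSE for one pruning mask."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return np.inf
    n, errs = len(y), []
    for _ in range(B):
        idx = rng.integers(0, n, n)            # redraw a training sample
        oob = np.setdiff1d(np.arange(n), idx)  # points left out of the redraw
        if oob.size == 0:
            continue
        w = np.linalg.lstsq(X[idx][:, cols], y[idx], rcond=None)[0]
        errs.append(np.mean((X[oob][:, cols] @ w - y[oob]) ** 2))
    return float(np.mean(errs))

# Evolutionary search over pruning masks: keep the best half, mutate copies.
pop = rng.integers(0, 2, (12, X.shape[1]))
for gen in range(15):
    fitness = np.array([bootstrap_oob_mse(m) for m in pop])
    parents = pop[np.argsort(fitness)[:6]]
    kids = parents.copy()
    flips = rng.random(kids.shape) < 0.15      # bit-flip mutation
    kids = np.where(flips, 1 - kids, kids)
    pop = np.vstack([parents, kids])

best = pop[np.argmin([bootstrap_oob_mse(m) for m in pop])]
print("selected units (powers kept):", np.flatnonzero(best))
```

Because the logistic map is quadratic in the lagged value, a run of this sketch typically prunes down to a small subset of terms; the bootstrap out-of-bag score is what penalizes overfit, over-parameterized masks.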
Date of creation: 1997
Handle: RePEc:att:wimass:9718