Practical variable selection for generalized additive models
The problem of variable selection within the class of generalized additive models is considered, in the setting where there are many covariates to choose from but the number of predictors is still somewhat smaller than the number of observations. Two very simple but effective shrinkage methods and an extension of the nonnegative garrote estimator are introduced. The proposals avoid nonparametric testing methods, for which no generally reliable distributional theory is available. Moreover, component selection is carried out in a single step, in contrast to selection procedures that require an exhaustive search over all possible models. The empirical performance of the proposed methods is compared with that of some available techniques via an extensive simulation study. The results show under which conditions one method can be preferred over another, providing applied researchers with practical guidelines. The procedures are also illustrated by analysing data on plasma beta-carotene levels from a cross-sectional study conducted in the United States.
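To fix ideas, the nonnegative garrote that the abstract mentions can be sketched in its original linear-model form (Breiman, 1995): fit ordinary least squares first, then shrink each coefficient by a nonnegative factor, penalising the sum of the factors so that weak components are driven exactly to zero. The sketch below is illustrative only and is not the paper's extension to smooth GAM components; the simulated data, the penalty level `lam`, and the Lagrangian formulation of the constraint are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated data: 6 predictors, only the first 3 truly matter.
n, p = 200, 6
X = rng.standard_normal((n, p))
beta_true = np.array([3.0, -2.0, 1.5, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

# Step 1: initial unpenalised least-squares estimates.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 2: nonnegative garrote -- shrink coefficient j by a factor
# c_j >= 0, penalising sum(c) (Lagrangian form of the constraint).
Z = X * beta_ols          # column j of X scaled by its OLS estimate
lam = 25.0                # shrinkage level (hypothetical choice)

def objective(c):
    r = y - Z @ c
    return 0.5 * r @ r + lam * c.sum()

res = minimize(objective, x0=np.ones(p), method="L-BFGS-B",
               bounds=[(0.0, None)] * p)
c_hat = res.x
beta_garrote = c_hat * beta_ols   # shrunken (possibly zeroed) estimates
```

With a moderate `lam`, the factors for the three null predictors collapse to (near) zero while the strong signals are only mildly shrunk, which is the one-step selection behaviour the abstract refers to; the paper's contribution is to carry this idea over to the smooth components of a GAM.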
Published in Computational Statistics & Data Analysis, vol. 55, issue 7 (2011), pages 2372-2387 (RePEc handle: RePEc:eee:csdana:v:55:y:2011:i:7:p:2372-2387).