Selection of components and degrees of smoothing via lasso in high dimensional nonparametric additive models
Abstract: This paper proposes a procedure for selecting components and degrees of smoothing in high-dimensional nonparametric additive models. In the procedure, different components receive different penalties, while all the smoothing parameters within one component share the same penalty. The idea is similar to, but in fact different from, the modified lasso of Wang et al. [Wang, H., Li, G.D., Tsai, C.L., 2007. Regression coefficient and autoregressive order shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B 69, 63-78], which requires different penalties for different parameters. The procedure orders the components by importance using the LARS algorithm of Efron et al. [Efron, B., Hastie, T., Johnstone, I., Tibshirani, R., 2004. Least angle regression. Annals of Statistics 32, 407-489]. Cross-validation (CV) or BIC selectors can be used to choose the tuning parameters, and some asymptotic properties of these selectors are proved. Simulation results and two real-data examples illustrate the procedure.
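The component-ordering step described in the abstract can be illustrated with a minimal sketch. This is not the authors' exact procedure: it uses scikit-learn's `lars_path` on covariates expanded into simple centred polynomial bases (standing in for the paper's spline smoothers), and declares a component "selected" the first time any of its basis columns enters the LARS path. All data, names, and the polynomial basis are illustrative assumptions.

```python
# Hedged sketch: ordering additive components via LARS on basis-expanded
# covariates. Polynomial bases stand in for spline smoothers; this is an
# illustration of the ordering idea, not the paper's actual algorithm.
import numpy as np
from sklearn.linear_model import lars_path

rng = np.random.default_rng(0)
n, p, degree = 200, 5, 3          # observations, covariates, basis size
X = rng.uniform(-1, 1, size=(n, p))
# True model uses only the first two components (additive, nonlinear).
y = np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.1, n)

# Expand each covariate into a centred polynomial basis (one block = one
# additive component).
blocks = [np.column_stack([X[:, j] ** d for d in range(1, degree + 1)])
          for j in range(p)]
B = np.column_stack(blocks)
B = (B - B.mean(axis=0)) / B.std(axis=0)

# LARS gives the entry order of individual basis columns; map each column
# index back to its component and keep the first appearance of each.
_, active, _ = lars_path(B, y - y.mean(), method="lar")
entry_component = [idx // degree for idx in active]
order = list(dict.fromkeys(entry_component))
print("component entry order:", order)
```

With this signal-to-noise setting, the first component to enter the path is the strongly informative `X[:, 0]`; the uninformative components enter only later, which is exactly the importance ordering the procedure exploits before tuning the penalties by CV or BIC.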
Bibliographic Info: Article published by Elsevier in Computational Statistics & Data Analysis, Volume 53, Issue 1 (September 2008).
Publisher web page: http://www.elsevier.com/locate/csda
- Simon N. Wood, 2004. "Stable and Efficient Multiple Smoothing Parameter Estimation for Generalized Additive Models," Journal of the American Statistical Association, American Statistical Association, vol. 99, pages 673-686, January.
- Ferraty, F. & Vieu, P., 2003. "Curves discrimination: a nonparametric functional approach," Computational Statistics & Data Analysis, Elsevier, vol. 44(1-2), pages 161-173, October.
- Hansheng Wang & Guodong Li & Chih-Ling Tsai, 2007. "Regression coefficient and autoregressive order shrinkage and selection via the lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 69(1), pages 63-78.
- de Uña Álvarez, Jacobo & Roca Pardiñas, Javier, 2009. "Additive models in censored regression," Computational Statistics & Data Analysis, Elsevier, vol. 53(9), pages 3490-3501, July.
- Boj, Eva & Delicado, Pedro & Fortiana, Josep, 2010. "Distance-based local linear regression for functional predictors," Computational Statistics & Data Analysis, Elsevier, vol. 54(2), pages 429-437, February.