Shrinkage tuning parameter selection with a diverging number of parameters
Abstract
Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For such problems, various shrinkage methods (e.g. the lasso and smoothly clipped absolute deviation) are found to be particularly useful for variable selection. Nevertheless, the desirable performance of those shrinkage methods hinges heavily on an appropriate selection of the tuning parameters. With a fixed predictor dimension, Wang and co-workers have demonstrated that the tuning parameters selected by a Bayesian information criterion (BIC) type criterion can identify the true model consistently. In this work, similar results are extended to the situation with a diverging number of parameters, for both unpenalized and penalized estimators. Consequently, our theoretical results enlarge not only the scope of applicability of BIC-type criteria but also that of those shrinkage estimation methods. Copyright (c) 2008 Royal Statistical Society.
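To illustrate the idea behind the abstract, the following is a minimal sketch (not the paper's actual procedure) of BIC-type tuning parameter selection for the lasso. It assumes an orthonormalized design, under which the lasso solution reduces to soft-thresholding the ordinary least squares coefficients; the tuning parameter is then chosen by minimizing a BIC-type criterion whose degrees of freedom equal the size of the active set.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 400, 10

# Orthonormalize the design so that X'X / n = I; the lasso estimator
# is then soft-thresholding applied to the OLS coefficients.
Q, _ = np.linalg.qr(rng.standard_normal((n, p)))
X = Q * np.sqrt(n)

beta_true = np.zeros(p)
beta_true[:2] = [3.0, 2.0]          # sparse truth: two active predictors
y = X @ beta_true + 0.5 * rng.standard_normal(n)

def soft_threshold(z, lam):
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

beta_ols = X.T @ y / n              # OLS under the orthonormal design

def bic(lam):
    beta = soft_threshold(beta_ols, lam)
    resid = y - X @ beta
    df = np.count_nonzero(beta)     # degrees of freedom = size of active set
    return np.log(resid @ resid / n) + df * np.log(n) / n, beta

# Minimize the BIC-type criterion over a grid of tuning parameters.
lambdas = np.linspace(0.01, 1.0, 100)
best_score, best_beta = min((bic(lam) for lam in lambdas), key=lambda t: t[0])
selected_support = set(np.flatnonzero(best_beta))
print(selected_support)
```

With a strong sparse signal, the BIC-minimizing tuning parameter tends to recover exactly the true active set, in line with the consistency result the abstract describes; the specific design, signal strengths, and grid here are illustrative assumptions.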
Bibliographic Info
Article provided by the Royal Statistical Society in its journal, the Journal of the Royal Statistical Society: Series B (Statistical Methodology).
Volume (Year): 71 (2009)
Issue (Month): 3 ()
Contact details of provider:
Postal: 12 Errol Street, London EC1Y 8LX, United Kingdom
Web page: http://www.blackwellpublishing.com/journal.asp?ref=1369-7412
Citations, as recorded by the CitEc project:
- Hu, Yuao & Lian, Heng, 2013. "Variable selection in a partially linear proportional hazards model with a diverging dimensionality," Statistics & Probability Letters, Elsevier, vol. 83(1), pages 61-69.
- Leng, Chenlei & Li, Bo, 2010. "Least squares approximation with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 80(3-4), pages 254-261, February.
- Lee, Sangin & Kim, Yongdai & Kwon, Sunghoon, 2012. "Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 82(9), pages 1710-1717.
- Li, Gaorong & Xue, Liugen & Lian, Heng, 2011. "Semi-varying coefficient models with a diverging number of components," Journal of Multivariate Analysis, Elsevier, vol. 102(7), pages 1166-1174, August.
- Lee, Eun Ryung & Park, Byeong U., 2012. "Sparse estimation in functional linear regression," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 1-17.
- Lian, Heng, 2012. "A note on the consistency of Schwarz’s criterion in linear quantile regression with the SCAD penalty," Statistics & Probability Letters, Elsevier, vol. 82(7), pages 1224-1228.
- Wang, Tao & Zhu, Lixing, 2011. "Consistent tuning parameter selection in high dimensional sparse linear regression," Journal of Multivariate Analysis, Elsevier, vol. 102(7), pages 1141-1151, August.