Shrinkage tuning parameter selection with a diverging number of parameters


Author Info

  • Hansheng Wang
  • Bo Li
  • Chenlei Leng


Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For those problems, various shrinkage methods (e.g. the lasso and smoothly clipped absolute deviation) are found to be particularly useful for variable selection. Nevertheless, the desirable performance of those shrinkage methods hinges heavily on an appropriate selection of the tuning parameters. With a fixed predictor dimension, Wang and co-workers have demonstrated that the tuning parameters selected by a Bayesian information criterion type criterion can identify the true model consistently. In this work, similar results are further extended to the situation with a diverging number of parameters for both unpenalized and penalized estimators. Consequently, our theoretical results further enlarge not only the scope of applicability of the original Bayesian information criterion type criteria but also that of those shrinkage estimation methods. Copyright (c) 2008 Royal Statistical Society.
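
For intuition, here is a minimal Python sketch (not the authors' code) of how a BIC-type criterion can select the lasso tuning parameter: fit the model over a grid of candidate values and keep the minimizer of log(sigma_hat^2) + df * log(n)/n * C_n. The diverging factor C_n, and the particular choice C_n = log(log(p)) used below, are illustrative assumptions rather than the paper's exact prescription.

```python
# Minimal sketch of BIC-type tuning parameter selection for the lasso.
# The criterion form and the choice C_n = log(log(p)) are illustrative
# assumptions, not taken verbatim from the paper.
import numpy as np
from sklearn.linear_model import Lasso

def bic_select_lasso(X, y, lambdas, C_n=None):
    """Return (bic, lambda, coefficients) minimizing a BIC-type criterion:
    log(sigma_hat^2) + df * log(n)/n * C_n."""
    n, p = X.shape
    if C_n is None:
        C_n = np.log(np.log(p))  # illustrative slowly diverging factor
    best = (np.inf, None, None)
    for lam in lambdas:
        fit = Lasso(alpha=lam, max_iter=10000).fit(X, y)
        resid = y - fit.predict(X)
        sigma2 = np.mean(resid ** 2)            # residual variance estimate
        df = np.count_nonzero(fit.coef_)        # model size as degrees of freedom
        bic = np.log(sigma2) + df * np.log(n) / n * C_n
        if bic < best[0]:
            best = (bic, lam, fit.coef_)
    return best

# Toy usage: a sparse truth with three active predictors.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.standard_normal(n)
bic, lam, coef = bic_select_lasso(X, y, np.logspace(-3, 0, 30))
print(lam, np.nonzero(coef)[0])
```

Letting C_n diverge slowly with the dimension is what distinguishes such a modified criterion from the classical BIC (C_n = 1), and is the mechanism behind the consistency results the paper establishes for diverging p.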

Download Info

Download Restriction: Access to full text is restricted to subscribers.

As access to this document is restricted, you may want to look for a different version under "Related research" below.

Bibliographic Info

Article provided by Royal Statistical Society in its journal Journal of the Royal Statistical Society: Series B (Statistical Methodology).

Volume (Year): 71 (2009)
Issue: 3
Pages: 671-683

Handle: RePEc:bla:jorssb:v:71:y:2009:i:3:p:671-683

Contact details of provider:
Postal: 12 Errol Street, London EC1Y 8LX, United Kingdom
Phone: +44-171-638-8998
Fax: +44-171-256-7598

Related research



No references listed on IDEAS


Citations are extracted by the CitEc Project.

Cited by:
  1. Lee, Eun Ryung & Park, Byeong U., 2012. "Sparse estimation in functional linear regression," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 1-17.
  2. Leng, Chenlei & Li, Bo, 2010. "Least squares approximation with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 80(3-4), pages 254-261, February.
  3. Li, Gaorong & Xue, Liugen & Lian, Heng, 2011. "Semi-varying coefficient models with a diverging number of components," Journal of Multivariate Analysis, Elsevier, vol. 102(7), pages 1166-1174, August.
  4. Hu, Yuao & Lian, Heng, 2013. "Variable selection in a partially linear proportional hazards model with a diverging dimensionality," Statistics & Probability Letters, Elsevier, vol. 83(1), pages 61-69.
  5. Lian, Heng, 2014. "Semiparametric Bayesian information criterion for model selection in ultra-high dimensional additive models," Journal of Multivariate Analysis, Elsevier, vol. 123(C), pages 304-310.
  6. Lian, Heng & Li, Jianbo & Tang, Xingyu, 2014. "SCAD-penalized regression in additive partially linear proportional hazards models with an ultra-high-dimensional linear part," Journal of Multivariate Analysis, Elsevier, vol. 125(C), pages 50-64.
  7. Wang, Tao & Zhu, Lixing, 2011. "Consistent tuning parameter selection in high dimensional sparse linear regression," Journal of Multivariate Analysis, Elsevier, vol. 102(7), pages 1141-1151, August.
  8. Dengke Xu & Zhongzhan Zhang & Liucang Wu, 2014. "Variable selection in high-dimensional double generalized linear models," Statistical Papers, Springer, vol. 55(2), pages 327-347, May.
  9. Sakyajit Bhattacharya & Paul McNicholas, 2014. "A LASSO-penalized BIC for mixture model selection," Advances in Data Analysis and Classification, Springer, vol. 8(1), pages 45-61, March.
  10. Lian, Heng, 2012. "A note on the consistency of Schwarz’s criterion in linear quantile regression with the SCAD penalty," Statistics & Probability Letters, Elsevier, vol. 82(7), pages 1224-1228.
  11. Lee, Sangin & Kim, Yongdai & Kwon, Sunghoon, 2012. "Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 82(9), pages 1710-1717.

