IDEAS home Printed from https://ideas.repec.org/a/bla/jorssb/v71y2009i3p671-683.html

Shrinkage tuning parameter selection with a diverging number of parameters

Author

Listed:
  • Hansheng Wang
  • Bo Li
  • Chenlei Leng

Abstract

Summary. Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For such problems, various shrinkage methods (e.g. the lasso and smoothly clipped absolute deviation) are found to be particularly useful for variable selection. Nevertheless, the desirable performance of these shrinkage methods hinges heavily on an appropriate selection of the tuning parameters. With a fixed predictor dimension, Wang and co-workers have demonstrated that the tuning parameters selected by a Bayesian information criterion type criterion can identify the true model consistently. In this work, similar results are extended to the situation with a diverging number of parameters, for both unpenalized and penalized estimators. Consequently, our theoretical results enlarge not only the scope of applicability of Bayesian information criterion type criteria but also that of these shrinkage estimation methods.
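As a rough illustration of the idea summarized above, the sketch below tunes a lasso penalty by minimizing a BIC-type criterion. This is not the authors' code: the coordinate-descent lasso solver, the candidate grid, and the specific criterion form log(sigma^2_hat) + df * log(n)/n * C_n with C_n = max(log log p, 1) are assumptions, with the C_n factor chosen only to mimic the kind of diverging-dimension modification discussed in this literature.

```python
import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator, the building block of lasso coordinate descent.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Plain cyclic coordinate descent for
    #   (1 / (2n)) * ||y - X b||^2 + lam * ||b||_1
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j.
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_ss[j]
    return b

def bic_select(X, y, lams):
    # BIC-type tuning parameter selection:
    #   crit(lam) = log(sigma^2_hat) + df * log(n)/n * C_n,
    # where df is the number of nonzero coefficients and
    # C_n = max(log(log(p)), 1) is an illustrative diverging factor
    # (an assumption here, not the paper's exact proposal).
    n, p = X.shape
    Cn = max(np.log(np.log(p)), 1.0)
    best = None
    for lam in lams:
        b = lasso_cd(X, y, lam)
        df = int(np.count_nonzero(b))
        sigma2 = ((y - X @ b) ** 2).mean()
        crit = np.log(sigma2) + df * np.log(n) / n * Cn
        if best is None or crit < best[0]:
            best = (crit, lam, b)
    return best[1], best[2]
```

On simulated data with a sparse true coefficient vector, minimizing this criterion over a grid of candidate penalties tends to recover the true support, which is the consistency property the paper studies as the dimension grows.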

Suggested Citation

  • Hansheng Wang & Bo Li & Chenlei Leng, 2009. "Shrinkage tuning parameter selection with a diverging number of parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 671-683, June.
  • Handle: RePEc:bla:jorssb:v:71:y:2009:i:3:p:671-683
    DOI: 10.1111/j.1467-9868.2008.00693.x

    Download full text from publisher

    File URL: https://doi.org/10.1111/j.1467-9868.2008.00693.x
    Download Restriction: no

    File URL: https://libkey.io/10.1111/j.1467-9868.2008.00693.x?utm_source=ideas

    References listed on IDEAS

    1. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    2. Zhao, Meng & Kulasekera, K.B., 2006. "Consistent linear model selection," Statistics & Probability Letters, Elsevier, vol. 76(5), pages 520-530, March.
3. Yuhong Yang, 2005. "Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation," Biometrika, Biometrika Trust, vol. 92(4), pages 937-950, December.
    4. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    5. Peide Shi & Chih‐Ling Tsai, 2002. "Regression model selection—a residual likelihood approach," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 64(2), pages 237-252, May.
    6. Hao Helen Zhang & Wenbin Lu, 2007. "Adaptive Lasso for Cox's proportional hazards model," Biometrika, Biometrika Trust, vol. 94(3), pages 691-703.
    7. Hansheng Wang & Guodong Li & Chih‐Ling Tsai, 2007. "Regression coefficient and autoregressive order shrinkage and selection via the lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 69(1), pages 63-78, February.
    8. Wang, Hansheng & Leng, Chenlei, 2007. "Unified LASSO Estimation by Least Squares Approximation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1039-1048, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pötscher, Benedikt M., 2007. "Confidence Sets Based on Sparse Estimators Are Necessarily Large," MPRA Paper 5677, University Library of Munich, Germany.
    2. Lenka Zbonakova & Wolfgang Karl Härdle & Weining Wang, 2016. "Time Varying Quantile Lasso," SFB 649 Discussion Papers SFB649DP2016-047, Sonderforschungsbereich 649, Humboldt University, Berlin, Germany.
    3. Na You & Shun He & Xueqin Wang & Junxian Zhu & Heping Zhang, 2018. "Subtype classification and heterogeneous prognosis model construction in precision medicine," Biometrics, The International Biometric Society, vol. 74(3), pages 814-822, September.
    4. Jianqing Fan & Jinchi Lv, 2008. "Sure independence screening for ultrahigh dimensional feature space," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(5), pages 849-911, November.
    5. Alessandro Gregorio & Francesco Iafrate, 2021. "Regularized bridge-type estimation with multiple penalties," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 73(5), pages 921-951, October.
    6. Wei Wang & Shou‐En Lu & Jerry Q. Cheng & Minge Xie & John B. Kostis, 2022. "Multivariate survival analysis in big data: A divide‐and‐combine approach," Biometrics, The International Biometric Society, vol. 78(3), pages 852-866, September.
    7. Kwon, Sunghoon & Lee, Sangin & Kim, Yongdai, 2015. "Moderately clipped LASSO," Computational Statistics & Data Analysis, Elsevier, vol. 92(C), pages 53-67.
    8. Zbonakova, L. & Härdle, W.K. & Wang, W., 2016. "Time Varying Quantile Lasso," Working Papers 16/07, Department of Economics, City University London.
    9. Xianyi Wu & Xian Zhou, 2019. "On Hodges’ superefficiency and merits of oracle property in model selection," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(5), pages 1093-1119, October.
    10. Leng, Chenlei & Li, Bo, 2010. "Least squares approximation with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 80(3-4), pages 254-261, February.
    11. Diego Vidaurre & Concha Bielza & Pedro Larrañaga, 2013. "A Survey of L1 Regression," International Statistical Review, International Statistical Institute, vol. 81(3), pages 361-387, December.
    12. Lee, Eun Ryung & Park, Byeong U., 2012. "Sparse estimation in functional linear regression," Journal of Multivariate Analysis, Elsevier, vol. 105(1), pages 1-17.
    13. Arslan, Olcay, 2012. "Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression," Computational Statistics & Data Analysis, Elsevier, vol. 56(6), pages 1952-1965.
    14. Hao, Meiling & Lin, Yunyuan & Zhao, Xingqiu, 2016. "A relative error-based approach for variable selection," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 250-262.
    15. Denis Agniel & Katherine P. Liao & Tianxi Cai, 2016. "Estimation and testing for multiple regulation of multivariate mixed outcomes," Biometrics, The International Biometric Society, vol. 72(4), pages 1194-1205, December.
    16. Stefano Maria IACUS & Alessandro DE GREGORIO, 2010. "Adaptive LASSO-type estimation for ergodic diffusion processes," Departmental Working Papers 2010-13, Department of Economics, Management and Quantitative Methods at Università degli Studi di Milano.
    17. Pötscher, Benedikt M. & Schneider, Ulrike, 2007. "On the distribution of the adaptive LASSO estimator," MPRA Paper 6913, University Library of Munich, Germany.
    18. Li, Jianbo & Gu, Minggao, 2012. "Adaptive LASSO for general transformation models with right censored data," Computational Statistics & Data Analysis, Elsevier, vol. 56(8), pages 2583-2597.
    19. Zhixuan Fu & Chirag R. Parikh & Bingqing Zhou, 2017. "Penalized variable selection in competing risks regression," Lifetime Data Analysis: An International Journal Devoted to Statistical Methods and Applications for Time-to-Event Data, Springer, vol. 23(3), pages 353-376, July.
    20. Hui Xiao & Yiguo Sun, 2020. "Forecasting the Returns of Cryptocurrency: A Model Averaging Approach," JRFM, MDPI, vol. 13(11), pages 1-15, November.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssb:v:71:y:2009:i:3:p:671-683. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.