Printed from https://ideas.repec.org/a/eee/stapro/v82y2012i9p1710-1717.html

Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters

Author

Listed:
  • Lee, Sangin
  • Kim, Yongdai
  • Kwon, Sunghoon

Abstract

We propose an approximated penalized estimator (APE) that covers various statistical models and nonconvex penalties, including the smoothly clipped absolute deviation (SCAD) penalty (Fan and Li, 2001) as a special case. The APE achieves the oracle property with a diverging number of parameters, which extends the results of Kwon et al. (2011). Several numerical studies confirm the theoretical results.
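To illustrate the general idea of replacing a nonconvex penalty by a quadratic surrogate, the following minimal sketch applies the local quadratic approximation of Fan and Li (2001) to SCAD-penalized least squares. It is not the authors' APE algorithm; the function names (scad_deriv, lqa_scad), the simulated data, and the tuning values are purely illustrative.

import numpy as np

def scad_deriv(t, lam, a=3.7):
    # Derivative p'_lambda(t) of the SCAD penalty (Fan and Li, 2001) for t >= 0,
    # with the commonly used value a = 3.7.
    t = np.abs(t)
    return lam * ((t <= lam).astype(float)
                  + np.maximum(a * lam - t, 0.0) / ((a - 1.0) * lam) * (t > lam))

def lqa_scad(X, y, lam, n_iter=100, eps=1e-8):
    # Local quadratic approximation: at each step the SCAD penalty is replaced by
    # a quadratic surrogate, so the update reduces to a ridge-type linear system.
    n, p = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # least-squares start
    XtX, Xty = X.T @ X / n, X.T @ y / n
    for _ in range(n_iter):
        w = scad_deriv(beta, lam) / np.maximum(np.abs(beta), eps)
        beta_new = np.linalg.solve(XtX + np.diag(w), Xty)
        if np.max(np.abs(beta_new - beta)) < 1e-6:
            beta = beta_new
            break
        beta = beta_new
    beta[np.abs(beta) < 1e-4] = 0.0                  # threshold tiny coefficients
    return beta

# Toy illustration with a sparse true coefficient vector.
rng = np.random.default_rng(0)
n, p = 200, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, 1.5, 2.0]
y = X @ beta_true + rng.standard_normal(n)
print(np.round(lqa_scad(X, y, lam=0.3), 2))

In this sketch the surrogate makes each update a closed-form linear solve, which is the computational appeal of quadratic approximations to nonconvex penalties such as SCAD.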

Suggested Citation

  • Lee, Sangin & Kim, Yongdai & Kwon, Sunghoon, 2012. "Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 82(9), pages 1710-1717.
  • Handle: RePEc:eee:stapro:v:82:y:2012:i:9:p:1710-1717
    DOI: 10.1016/j.spl.2012.05.012

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167715212001873
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.spl.2012.05.012?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ledoit, Olivier & Wolf, Michael, 2004. "A well-conditioned estimator for large-dimensional covariance matrices," Journal of Multivariate Analysis, Elsevier, vol. 88(2), pages 365-411, February.
    2. Kwon, Sunghoon & Choi, Hosik & Kim, Yongdai, 2011. "Quadratic approximation on SCAD penalized estimation," Computational Statistics & Data Analysis, Elsevier, vol. 55(1), pages 421-428, January.
    3. Yongdai Kim & Sunghoon Kwon, 2012. "Global optimality of nonconvex penalized estimators," Biometrika, Biometrika Trust, vol. 99(2), pages 315-325.
    4. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    5. Hansheng Wang & Bo Li & Chenlei Leng, 2009. "Shrinkage tuning parameter selection with a diverging number of parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 671-683, June.
    6. Lam, Clifford & Fan, Jianqing, 2008. "Profile-kernel likelihood inference with diverging number of parameters," LSE Research Online Documents on Economics 31548, London School of Economics and Political Science, LSE Library.
    7. He, Xuming & Shao, Qi-Man, 2000. "On Parameters of Increasing Dimensions," Journal of Multivariate Analysis, Elsevier, vol. 73(1), pages 120-135, April.
    8. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    9. Leng, Chenlei & Li, Bo, 2010. "Least squares approximation with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 80(3-4), pages 254-261, February.
    10. Kim, Yongdai & Choi, Hosik & Oh, Hee-Seok, 2008. "Smoothly Clipped Absolute Deviation on High Dimensions," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1665-1673.
    11. Wang, Hansheng & Leng, Chenlei, 2007. "Unified LASSO Estimation by Least Squares Approximation," Journal of the American Statistical Association, American Statistical Association, vol. 102, pages 1039-1048, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Leng, Chenlei & Li, Bo, 2010. "Least squares approximation with a diverging number of parameters," Statistics & Probability Letters, Elsevier, vol. 80(3-4), pages 254-261, February.
    2. Kwon, Sunghoon & Lee, Sangin & Kim, Yongdai, 2015. "Moderately clipped LASSO," Computational Statistics & Data Analysis, Elsevier, vol. 92(C), pages 53-67.
    3. Yanxin Wang & Qibin Fan & Li Zhu, 2018. "Variable selection and estimation using a continuous approximation to the $L_0$ penalty," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 70(1), pages 191-214, February.
    4. Sunghoon Kwon & Jeongyoun Ahn & Woncheol Jang & Sangin Lee & Yongdai Kim, 2017. "A doubly sparse approach for group variable selection," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 69(5), pages 997-1025, October.
    5. Lenka Zbonakova & Wolfgang Karl Härdle & Weining Wang, 2016. "Time Varying Quantile Lasso," SFB 649 Discussion Papers SFB649DP2016-047, Sonderforschungsbereich 649, Humboldt University, Berlin, Germany.
    6. Caner, Mehmet & Fan, Qingliang, 2015. "Hybrid generalized empirical likelihood estimators: Instrument selection with adaptive lasso," Journal of Econometrics, Elsevier, vol. 187(1), pages 256-274.
    7. Fei Jin & Lung-fei Lee, 2018. "Lasso Maximum Likelihood Estimation of Parametric Models with Singular Information Matrices," Econometrics, MDPI, vol. 6(1), pages 1-24, February.
    8. Jin, Fei & Lee, Lung-fei, 2018. "Irregular N2SLS and LASSO estimation of the matrix exponential spatial specification model," Journal of Econometrics, Elsevier, vol. 206(2), pages 336-358.
    9. Xiang Zhang & Yichao Wu & Lan Wang & Runze Li, 2016. "Variable selection for support vector machines in moderately high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(1), pages 53-76, January.
    10. Joel L. Horowitz, 2015. "Variable selection and estimation in high-dimensional models," CeMMAP working papers 35/15, Institute for Fiscal Studies.
    11. Zhaoping Hong & Yuao Hu & Heng Lian, 2013. "Variable selection for high-dimensional varying coefficient partially linear models via nonconcave penalty," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 76(7), pages 887-908, October.
    12. Kean Ming Tan & Lan Wang & Wen‐Xin Zhou, 2022. "High‐dimensional quantile regression: Convolution smoothing and concave regularization," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(1), pages 205-233, February.
    13. Fan, Rui & Lee, Ji Hyung & Shin, Youngki, 2023. "Predictive quantile regression with mixed roots and increasing dimensions: The ALQR approach," Journal of Econometrics, Elsevier, vol. 237(2).
    14. Quynh Van Nong & Chi Tim Ng, 2021. "Clustering of subsample means based on pairwise L1 regularized empirical likelihood," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 73(1), pages 135-174, February.
    15. Gaorong Li & Liugen Xue & Heng Lian, 2012. "SCAD-penalised generalised additive models with non-polynomial dimensionality," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 24(3), pages 681-697.
    16. Li, Xinjue & Zboňáková, Lenka & Wang, Weining & Härdle, Wolfgang Karl, 2019. "Combining Penalization and Adaption in High Dimension with Application in Bond Risk Premia Forecasting," IRTG 1792 Discussion Papers 2019-030, Humboldt University of Berlin, International Research Training Group 1792 "High Dimensional Nonstationary Time Series".
    17. Jeon, Jong-June & Kwon, Sunghoon & Choi, Hosik, 2017. "Homogeneity detection for the high-dimensional generalized linear model," Computational Statistics & Data Analysis, Elsevier, vol. 114(C), pages 61-74.
    18. Ping Zeng & Yongyue Wei & Yang Zhao & Jin Liu & Liya Liu & Ruyang Zhang & Jianwei Gou & Shuiping Huang & Feng Chen, 2014. "Variable selection approach for zero-inflated count data via adaptive lasso," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(4), pages 879-894, April.
    19. Xia, Xiaochao & Liu, Zhi & Yang, Hu, 2016. "Regularized estimation for the least absolute relative error models with a diverging number of covariates," Computational Statistics & Data Analysis, Elsevier, vol. 96(C), pages 104-119.
    20. Kwon, Sunghoon & Choi, Hosik & Kim, Yongdai, 2011. "Quadratic approximation on SCAD penalized estimation," Computational Statistics & Data Analysis, Elsevier, vol. 55(1), pages 421-428, January.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:stapro:v:82:y:2012:i:9:p:1710-1717. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.