IDEAS home Printed from https://ideas.repec.org/a/spr/compst/v38y2023i2d10.1007_s00180-022-01249-w.html

A global two-stage algorithm for non-convex penalized high-dimensional linear regression problems

Author

Listed:
  • Peili Li

    (East China Normal University)

  • Min Liu

    (Wuhan University)

  • Zhou Yu

    (East China Normal University)

Abstract

Owing to their asymptotic oracle property, non-convex penalties, represented by the minimax concave penalty (MCP) and the smoothly clipped absolute deviation (SCAD) penalty, have attracted much attention in high-dimensional data analysis and have been widely used in signal processing, image restoration, matrix estimation, etc. However, their non-convex and non-smooth characteristics make them computationally challenging. Almost all existing algorithms converge only locally, so the proper selection of initial values is crucial; in practice they are therefore often combined with a warm-starting technique to meet the rigid requirement that the initial value be sufficiently close to the optimal solution of the corresponding problem. In this paper, based on the DC (difference of convex functions) property of the MCP and SCAD penalties, we design a global two-stage algorithm for high-dimensional penalized least squares linear regression problems. A key idea for making the proposed algorithm efficient is to use the primal dual active set with continuation (PDASC) method to solve the corresponding sub-problems. Theoretically, we not only prove the global convergence of the proposed algorithm but also verify that the generated iterative sequence converges to a d-stationary point. In terms of computational performance, extensive simulation and real-data studies show that the proposed algorithm is superior to the recent SSN method and the classic coordinate descent (CD) algorithm for solving non-convex penalized high-dimensional linear regression problems.
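For readers unfamiliar with the DC property the abstract invokes: the MCP can be written as p(t) = λ|t| − h(t), where h is convex, so the penalty is a difference of two convex functions. The sketch below (function names are ours, not from the paper) implements the standard MCP formula and its convex component, and numerically checks the decomposition; it is an illustration of the DC structure only, not the authors' two-stage algorithm.

```python
import numpy as np

def mcp(t, lam, gamma):
    """Minimax concave penalty (MCP), applied elementwise.

    p(t) = lam*|t| - t^2/(2*gamma)  for |t| <= gamma*lam,
         = gamma*lam^2/2            otherwise.
    """
    a = np.abs(t)
    return np.where(a <= gamma * lam,
                    lam * a - a ** 2 / (2 * gamma),
                    gamma * lam ** 2 / 2)

def mcp_convex_part(t, lam, gamma):
    """Convex function h such that mcp(t) = lam*|t| - h(t).

    h is the Huber-type function: quadratic near zero, linear in the tails,
    which is what makes the decomposition a difference of convex functions.
    """
    a = np.abs(t)
    return np.where(a <= gamma * lam,
                    a ** 2 / (2 * gamma),
                    lam * a - gamma * lam ** 2 / 2)

lam, gamma = 1.0, 3.0
t = np.linspace(-5.0, 5.0, 101)
# Verify the DC identity  mcp(t) = lam*|t| - h(t)  on a grid.
assert np.allclose(mcp(t, lam, gamma), lam * np.abs(t) - mcp_convex_part(t, lam, gamma))
```

A DC algorithm exploits this structure by linearizing the concave part −h at the current iterate, leaving a convex (here, weighted-lasso-type) sub-problem at each stage; in the paper those sub-problems are solved by PDASC.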

Suggested Citation

  • Peili Li & Min Liu & Zhou Yu, 2023. "A global two-stage algorithm for non-convex penalized high-dimensional linear regression problems," Computational Statistics, Springer, vol. 38(2), pages 871-898, June.
  • Handle: RePEc:spr:compst:v:38:y:2023:i:2:d:10.1007_s00180-022-01249-w
    DOI: 10.1007/s00180-022-01249-w

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s00180-022-01249-w
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s00180-022-01249-w?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Fan J. & Li R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    2. Shan Luo & Zehua Chen, 2014. "Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(507), pages 1229-1240, September.
    3. Mazumder, Rahul & Friedman, Jerome H. & Hastie, Trevor, 2011. "SparseNet: Coordinate Descent With Nonconvex Penalties," Journal of the American Statistical Association, American Statistical Association, vol. 106(495), pages 1125-1138.
    4. Le Thi, H.A. & Pham Dinh, T. & Le, H.M. & Vo, X.T., 2015. "DC approximation approaches for sparse optimization," European Journal of Operational Research, Elsevier, vol. 244(1), pages 26-46.
    5. Jong-Shi Pang & Meisam Razaviyayn & Alberth Alvarado, 2017. "Computing B-Stationary Points of Nonsmooth DC Programs," Mathematics of Operations Research, INFORMS, vol. 42(1), pages 95-118, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Miju Ahn, 2020. "Consistency bounds and support recovery of d-stationary solutions of sparse sample average approximations," Journal of Global Optimization, Springer, vol. 78(3), pages 397-422, November.
    2. Bartosz Uniejewski, 2024. "Regularization for electricity price forecasting," Papers 2404.03968, arXiv.org.
    3. Li, Peili & Jiao, Yuling & Lu, Xiliang & Kang, Lican, 2022. "A data-driven line search rule for support recovery in high-dimensional data analysis," Computational Statistics & Data Analysis, Elsevier, vol. 174(C).
    4. Jin, Shaobo & Moustaki, Irini & Yang-Wallentin, Fan, 2018. "Approximated penalized maximum likelihood for exploratory factor analysis: an orthogonal case," LSE Research Online Documents on Economics 88118, London School of Economics and Political Science, LSE Library.
    5. Anda Tang & Pei Quan & Lingfeng Niu & Yong Shi, 2022. "A Survey for Sparse Regularization Based Compression Methods," Annals of Data Science, Springer, vol. 9(4), pages 695-722, August.
    6. Ben-Ameur, Walid & Neto, José, 2022. "New bounds for subset selection from conic relaxations," European Journal of Operational Research, Elsevier, vol. 298(2), pages 425-438.
    7. Hoai An Le Thi & Manh Cuong Nguyen, 2017. "DCA based algorithms for feature selection in multi-class support vector machine," Annals of Operations Research, Springer, vol. 249(1), pages 273-300, February.
    8. Xiang Zhang & Yichao Wu & Lan Wang & Runze Li, 2016. "Variable selection for support vector machines in moderately high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(1), pages 53-76, January.
    9. Hu, Jianwei & Chai, Hao, 2013. "Adjusted regularized estimation in the accelerated failure time model with high dimensional covariates," Journal of Multivariate Analysis, Elsevier, vol. 122(C), pages 96-114.
    10. Minh Pham & Xiaodong Lin & Andrzej Ruszczyński & Yu Du, 2021. "An outer–inner linearization method for non-convex and nondifferentiable composite regularization problems," Journal of Global Optimization, Springer, vol. 81(1), pages 179-202, September.
    11. Yingying Fan & Jinchi Lv, 2014. "Asymptotic properties for combined L1 and concave regularization," Biometrika, Biometrika Trust, vol. 101(1), pages 57-70.
    12. Siwei Xia & Yuehan Yang & Hu Yang, 2022. "Sparse Laplacian Shrinkage with the Graphical Lasso Estimator for Regression Problems," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(1), pages 255-277, March.
    13. Liu, Wenchen & Tang, Yincai & Wu, Xianyi, 2020. "Separating variables to accelerate non-convex regularized optimization," Computational Statistics & Data Analysis, Elsevier, vol. 147(C).
    14. Gabriel E Hoffman & Benjamin A Logsdon & Jason G Mezey, 2013. "PUMA: A Unified Framework for Penalized Multiple Regression Analysis of GWAS Data," PLOS Computational Biology, Public Library of Science, vol. 9(6), pages 1-19, June.
    15. Hirose, Kei & Tateishi, Shohei & Konishi, Sadanori, 2013. "Tuning parameter selection in sparse regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 59(C), pages 28-40.
    16. Dongdong Zhang & Shaohua Pan & Shujun Bi & Defeng Sun, 2023. "Zero-norm regularized problems: equivalent surrogates, proximal MM method and statistical error bound," Computational Optimization and Applications, Springer, vol. 86(2), pages 627-667, November.
    17. Yawei He & Zehua Chen, 2016. "The EBIC and a sequential procedure for feature selection in interactive linear models with high-dimensional data," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 68(1), pages 155-180, February.
    18. Yuta Umezu & Yusuke Shimizu & Hiroki Masuda & Yoshiyuki Ninomiya, 2019. "AIC for the non-concave penalized likelihood method," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(2), pages 247-274, April.
    19. Yen, Yu-Min & Yen, Tso-Jung, 2014. "Solving norm constrained portfolio optimization via coordinate-wise descent algorithms," Computational Statistics & Data Analysis, Elsevier, vol. 76(C), pages 737-759.
    20. Xian Zhang & Dingtao Peng, 2022. "Solving constrained nonsmooth group sparse optimization via group Capped-ℓ1 relaxation and group smoothing proximal gradient algorithm," Computational Optimization and Applications, Springer, vol. 83(3), pages 801-844, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:compst:v:38:y:2023:i:2:d:10.1007_s00180-022-01249-w. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.