
L₀-Regularized Learning for High-Dimensional Additive Hazards Regression

Author

Listed:
  • Zemin Zheng (International Institute of Finance, The School of Management, University of Science and Technology of China, Hefei, Anhui 230026, P. R. China)
  • Jie Zhang (International Institute of Finance, The School of Management, University of Science and Technology of China, Hefei, Anhui 230026, P. R. China)
  • Yang Li (International Institute of Finance, The School of Management, University of Science and Technology of China, Hefei, Anhui 230026, P. R. China)

Abstract

Sparse learning in high-dimensional survival analysis is of great practical importance, as exemplified by modern applications in credit risk analysis and high-throughput genomic data analysis. In this article, we consider L₀-regularized learning for simultaneous variable selection and estimation under the framework of additive hazards models and utilize the idea of primal-dual active sets to develop an algorithm targeted at solving the traditionally nonpolynomial-time optimization problem. Under interpretable conditions, comprehensive statistical properties, including model selection consistency, oracle inequalities under various estimation losses, and the oracle property, are established for the global optimizer of the proposed approach. Moreover, our theoretical analysis of the algorithmic solution reveals that the proposed L₀-regularized learning can be more efficient than other regularization methods in that it requires a smaller sample size as well as a lower minimum signal strength to identify the significant features. The effectiveness of the proposed method is evidenced by simulation studies and real-data analysis.

Summary of Contribution: Feature selection is a fundamental statistical learning technique under high dimensions and is routinely encountered in various areas, including operations research and computing. This paper focuses on L₀-regularized learning for feature selection in high-dimensional additive hazards regression. The matching algorithm for solving the nonconvex L₀-constrained problem is scalable and enjoys comprehensive theoretical properties.
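The primal-dual active set idea referenced in the abstract can be sketched in a few lines. The snippet below is only an illustration under simplifying assumptions: it applies the active-set iteration to L₀-penalized least squares with standardized design columns rather than to the paper's additive hazards loss, and the function name, the fixed penalty level lam, and the sqrt(2*lam) threshold are illustrative choices, not the authors' implementation.

    import numpy as np

    def pdas_l0_least_squares(X, y, lam, max_iter=50):
        """Illustrative primal-dual active set iteration for
        (1/2n) * ||y - X @ beta||^2 + lam * ||beta||_0,
        assuming the columns of X are standardized (x_j'x_j / n = 1)."""
        n, p = X.shape
        beta = np.zeros(p)                 # primal variable
        d = X.T @ y / n                    # dual variable (negative gradient at beta = 0)
        active = np.zeros(p, dtype=bool)
        for _ in range(max_iter):
            new_active = np.abs(beta + d) > np.sqrt(2 * lam)
            if np.array_equal(new_active, active):
                break                      # active set has stabilized
            active = new_active
            beta = np.zeros(p)
            if active.any():
                XA = X[:, active]
                # least-squares refit restricted to the active set
                # (assumes XA has full column rank)
                beta[active] = np.linalg.solve(XA.T @ XA, XA.T @ y)
            d = X.T @ (y - X @ beta) / n   # update dual on the inactive set
            d[active] = 0.0                # complementarity: dual vanishes on the active set
        return beta

In the setting of the article, the least-squares refit would be replaced by the estimating equations of the additive hazards model restricted to the active set, and the penalty level would be tuned rather than fixed; the sketch only conveys the alternating primal/dual structure that makes the L₀ problem computationally tractable.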

Suggested Citation

  • Zemin Zheng & Jie Zhang & Yang Li, 2022. "L₀-Regularized Learning for High-Dimensional Additive Hazards Regression," INFORMS Journal on Computing, INFORMS, vol. 34(5), pages 2762-2775, September.
  • Handle: RePEc:inm:orijoc:v:34:y:2022:i:5:p:2762-2775
    DOI: 10.1287/ijoc.2022.1208

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/ijoc.2022.1208
    Download Restriction: no

    File URL: https://libkey.io/10.1287/ijoc.2022.1208?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Zemin Zheng & Jinchi Lv & Wei Lin, 2021. "Nonsparse Learning with Latent Variables," Operations Research, INFORMS, vol. 69(1), pages 346-359, January.
    2. Mehmet Caner & Anders Bredahl Kock, 2016. "Oracle Inequalities for Convex Loss Functions with Nonlinear Targets," Econometric Reviews, Taylor & Francis Journals, vol. 35(8-10), pages 1377-1411, December.
    3. Alexander Chudik & George Kapetanios & M. Hashem Pesaran, 2016. "Big Data Analytics: A New Perspective," CESifo Working Paper Series 5824, CESifo.
    4. Umberto Amato & Anestis Antoniadis & Italia De Feis & Irene Gijbels, 2021. "Penalised robust estimators for sparse and high-dimensional linear models," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 30(1), pages 1-48, March.
    5. Victor Chernozhukov & Christian Hansen & Yuan Liao, 2015. "A lava attack on the recovery of sums of dense and sparse signals," CeMMAP working papers CWP56/15, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    6. A. Chudik & G. Kapetanios & M. Hashem Pesaran, 2018. "A One Covariate at a Time, Multiple Testing Approach to Variable Selection in High‐Dimensional Linear Regression Models," Econometrica, Econometric Society, vol. 86(4), pages 1479-1512, July.
    7. Zheng, Zemin & Li, Yang & Yu, Chongxiu & Li, Gaorong, 2018. "Balanced estimation for high-dimensional measurement error models," Computational Statistics & Data Analysis, Elsevier, vol. 126(C), pages 78-91.
    8. Sermpinis, Georgios & Tsoukas, Serafeim & Zhang, Ping, 2018. "Modelling market implied ratings using LASSO variable selection techniques," Journal of Empirical Finance, Elsevier, vol. 48(C), pages 19-35.
    9. Laura Freijeiro‐González & Manuel Febrero‐Bande & Wenceslao González‐Manteiga, 2022. "A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates," International Statistical Review, International Statistical Institute, vol. 90(1), pages 118-145, April.
    10. Adel Javanmard & Jason D. Lee, 2020. "A flexible framework for hypothesis testing in high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 82(3), pages 685-718, July.
    11. Margherita Giuzio, 2017. "Genetic algorithm versus classical methods in sparse index tracking," Decisions in Economics and Finance, Springer;Associazione per la Matematica, vol. 40(1), pages 243-256, November.
    12. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    13. Soave, David & Lawless, Jerald F., 2023. "Regularized regression for two phase failure time studies," Computational Statistics & Data Analysis, Elsevier, vol. 182(C).
    14. Peter Bühlmann & Jacopo Mandozzi, 2014. "High-dimensional variable screening and bias in subsequent inference, with an empirical comparison," Computational Statistics, Springer, vol. 29(3), pages 407-430, June.
    15. Takumi Saegusa & Tianzhou Ma & Gang Li & Ying Qing Chen & Mei-Ling Ting Lee, 2020. "Variable Selection in Threshold Regression Model with Applications to HIV Drug Adherence Data," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 12(3), pages 376-398, December.
    16. Zanhua Yin, 2020. "Variable selection for sparse logistic regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 83(7), pages 821-836, October.
    17. Cui, Hailong & Rajagopalan, Sampath & Ward, Amy R., 2020. "Predicting product return volume using machine learning methods," European Journal of Operational Research, Elsevier, vol. 281(3), pages 612-627.
    18. Xing, Li-Min & Zhang, Yue-Jun, 2022. "Forecasting crude oil prices with shrinkage methods: Can nonconvex penalty and Huber loss help?," Energy Economics, Elsevier, vol. 110(C).
    19. Lasanthi C. R. Pelawa Watagoda & David J. Olive, 2021. "Comparing six shrinkage estimators with large sample theory and asymptotically optimal prediction intervals," Statistical Papers, Springer, vol. 62(5), pages 2407-2431, October.
    20. Xiaofei Wu & Rongmei Liang & Hu Yang, 2022. "Penalized and constrained LAD estimation in fixed and high dimension," Statistical Papers, Springer, vol. 63(1), pages 53-95, February.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:orijoc:v:34:y:2022:i:5:p:2762-2775. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.