
ℓ2,0-norm based selection and estimation for multivariate generalized linear models

Author

Listed:
  • Chen, Yang
  • Luo, Ziyan
  • Kong, Lingchen

Abstract

Group sparse regression has been well studied in multivariate linear models, with appropriate relaxation schemes for the ℓ2,0-norm penalty involved. Since such research has not been extended to multivariate generalized linear models (GLMs), this paper targets the original discontinuous and nonconvex ℓ2,0-norm based selection and estimation for multivariate GLMs. Under mild conditions, we give a necessary condition for selection consistency based on the notion of degree of separation, and establish feature selection consistency as well as optimal coefficient estimation for the resulting ℓ2,0-likelihood estimators in terms of the Hellinger risk. Numerical studies on synthetic data and a real chemometrics data set confirm the superior performance of the ℓ2,0-likelihood methods.
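
For readers new to the penalty in the title, the ℓ2,0-norm of a p×q coefficient matrix counts the rows whose ℓ2 norm is nonzero, i.e., the features that are active jointly across all q responses, so penalizing it enforces row-wise (group) sparsity without any convex relaxation. The following Python/NumPy sketch only illustrates this definition and an ℓ2,0-penalized negative log-likelihood for a multivariate Poisson GLM; the function names, the Poisson example, and the tolerance are assumptions made for illustration and do not reproduce the estimator or algorithm analyzed in the paper.

    import numpy as np

    def l20_norm(B, tol=1e-10):
        # Count rows of B whose Euclidean norm exceeds tol: each row holds
        # one feature's coefficients across all responses, so this is the
        # number of jointly active features.
        return int(np.sum(np.linalg.norm(B, axis=1) > tol))

    def penalized_neg_loglik(B, X, Y, lam):
        # Illustrative l2,0-penalized objective for a multivariate Poisson
        # GLM with canonical log link: E[Y] = exp(X @ B).
        # X: (n, p) design, Y: (n, q) counts, B: (p, q) coefficients.
        eta = X @ B                          # linear predictors, shape (n, q)
        nll = np.sum(np.exp(eta) - Y * eta)  # negative log-likelihood up to constants
        return nll + lam * l20_norm(B)

    # Tiny synthetic illustration: only the first two features are active.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 8))
    B_true = np.zeros((8, 3))
    B_true[:2, :] = 0.5
    Y = rng.poisson(np.exp(X @ B_true))
    print(l20_norm(B_true))                        # -> 2
    print(penalized_neg_loglik(B_true, X, Y, 1.0))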

Suggested Citation

  • Chen, Yang & Luo, Ziyan & Kong, Lingchen, 2021. "ℓ2,0-norm based selection and estimation for multivariate generalized linear models," Journal of Multivariate Analysis, Elsevier, vol. 185(C).
  • Handle: RePEc:eee:jmvana:v:185:y:2021:i:c:s0047259x21000609
    DOI: 10.1016/j.jmva.2021.104782

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X21000609
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2021.104782?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    2. Lichun Wang & Yuan You & Heng Lian, 2015. "Convergence and sparsity of Lasso and group Lasso in high-dimensional generalized linear models," Statistical Papers, Springer, vol. 56(3), pages 819-828, August.
    3. Xiaoguang Wang & Junhui Fan, 2014. "Variable selection for multivariate generalized linear models," Journal of Applied Statistics, Taylor & Francis Journals, vol. 41(2), pages 393-406, February.
    4. Yanming Li & Bin Nan & Ji Zhu, 2015. "Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure," Biometrics, The International Biometric Society, vol. 71(2), pages 354-363, June.
    5. Kim, Yongdai & Choi, Hosik & Oh, Hee-Seok, 2008. "Smoothly Clipped Absolute Deviation on High Dimensions," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1665-1673.
    6. Fan, J. & Li, R., 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    7. Rui Wang & Naihua Xiu & Kim-Chuan Toh, 2021. "Subspace quadratic regularization method for group sparse multinomial logistic regression," Computational Optimization and Applications, Springer, vol. 79(3), pages 531-559, July.
    8. Bae, S. & Famoye, F. & Wulu, J.T. & Bartolucci, A.A. & Singh, K.P., 2005. "A rich family of generalized Poisson regression models with applications," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 69(1), pages 4-11.
    9. Xiaotong Shen & Wei Pan & Yunzhang Zhu & Hui Zhou, 2013. "On constrained and regularized high-dimensional regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 65(5), pages 807-832, October.
    10. Hu, Jianhua & Xin, Xin & You, Jinhong, 2014. "Model determination and estimation for the growth curve model via group SCAD penalty," Journal of Multivariate Analysis, Elsevier, vol. 124(C), pages 199-213.
    11. Xiaotong Shen & Wei Pan & Yunzhang Zhu, 2012. "Likelihood-Based Selection and Sharp Parameter Estimation," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(497), pages 223-232, March.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xiaotong Shen & Wei Pan & Yunzhang Zhu & Hui Zhou, 2013. "On constrained and regularized high-dimensional regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 65(5), pages 807-832, October.
    2. Canhong Wen & Xueqin Wang & Shaoli Wang, 2015. "Laplace Error Penalty-based Variable Selection in High Dimension," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 42(3), pages 685-700, September.
    3. Dai, Linlin & Chen, Kani & Sun, Zhihua & Liu, Zhenqiu & Li, Gang, 2018. "Broken adaptive ridge regression and its asymptotic properties," Journal of Multivariate Analysis, Elsevier, vol. 168(C), pages 334-351.
    4. Kwon, Sunghoon & Oh, Seungyoung & Lee, Youngjo, 2016. "The use of random-effect models for high-dimensional variable selection problems," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 401-412.
    5. Mike K. P. So & Wing Ki Liu & Amanda M. Y. Chu, 2018. "Bayesian Shrinkage Estimation Of Time-Varying Covariance Matrices In Financial Time Series," Advances in Decision Sciences, Asia University, Taiwan, vol. 22(1), pages 369-404, December.
    6. Shanshan Qin & Hao Ding & Yuehua Wu & Feng Liu, 2021. "High-dimensional sign-constrained feature selection and grouping," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 73(4), pages 787-819, August.
    7. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    8. Gaorong Li & Liugen Xue & Heng Lian, 2012. "SCAD-penalised generalised additive models with non-polynomial dimensionality," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 24(3), pages 681-697.
    9. Shan Luo & Zehua Chen, 2014. "Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 109(507), pages 1229-1240, September.
    10. Loann David Denis Desboulets, 2018. "A Review on Variable Selection in Regression Analysis," Econometrics, MDPI, vol. 6(4), pages 1-27, November.
    11. Li, Xinjue & Zboňáková, Lenka & Wang, Weining & Härdle, Wolfgang Karl, 2019. "Combining Penalization and Adaption in High Dimension with Application in Bond Risk Premia Forecasting," IRTG 1792 Discussion Papers 2019-030, Humboldt University of Berlin, International Research Training Group 1792 "High Dimensional Nonstationary Time Series".
    12. Lan Wang & Yichao Wu & Runze Li, 2012. "Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(497), pages 214-222, March.
    13. Sokbae Lee & Myung Hwan Seo & Youngki Shin, 2016. "The lasso for high dimensional regression with a possible change point," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(1), pages 193-210, January.
    14. Lian, Heng & Li, Jianbo & Hu, Yuao, 2013. "Shrinkage variable selection and estimation in proportional hazards models with additive structure and high dimensionality," Computational Statistics & Data Analysis, Elsevier, vol. 63(C), pages 99-112.
    15. Wentao Wang & Jiaxuan Liang & Rong Liu & Yunquan Song & Min Zhang, 2022. "A Robust Variable Selection Method for Sparse Online Regression via the Elastic Net Penalty," Mathematics, MDPI, vol. 10(16), pages 1-18, August.
    16. Fan, Zengyan & Lian, Heng, 2018. "Quantile regression for additive coefficient models in high dimensions," Journal of Multivariate Analysis, Elsevier, vol. 164(C), pages 54-64.
    17. Xia, Siwei & Yang, Yuehan & Yang, Hu, 2023. "High-dimensional sparse portfolio selection with nonnegative constraint," Applied Mathematics and Computation, Elsevier, vol. 443(C).
    18. Ricardo P. Masini & Marcelo C. Medeiros & Eduardo F. Mendes, 2023. "Machine learning advances for time series forecasting," Journal of Economic Surveys, Wiley Blackwell, vol. 37(1), pages 76-111, February.
    19. Lina Liao & Cheolwoo Park & Hosik Choi, 2019. "Penalized expectile regression: an alternative to penalized quantile regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 71(2), pages 409-438, April.
    20. Jeon, Jong-June & Kwon, Sunghoon & Choi, Hosik, 2017. "Homogeneity detection for the high-dimensional generalized linear model," Computational Statistics & Data Analysis, Elsevier, vol. 114(C), pages 61-74.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:185:y:2021:i:c:s0047259x21000609. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.