
Envelope-based sparse reduced-rank regression for multivariate linear model

Author

Listed:
  • Guo, Wenxing
  • Balakrishnan, Narayanaswamy
  • He, Mu

Abstract

Envelope models were first proposed by Cook et al. (2010) as a method to reduce estimation and prediction variability in multivariate regression. Sparse reduced-rank regression, introduced by Chen and Huang (2012), is a widely used technique that performs dimension reduction and variable selection simultaneously in multivariate regression. In this work, we combine envelope models with the sparse reduced-rank regression method to propose an envelope-based sparse reduced-rank regression estimator, and we establish its consistency, asymptotic normality and oracle property in the high-dimensional setting. We carry out Monte Carlo simulation studies and analyze two datasets to demonstrate that the proposed envelope-based sparse reduced-rank regression method displays good variable selection and prediction performance.
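The simultaneous dimension reduction and variable selection that the abstract describes can be illustrated with a small numerical sketch in the spirit of Chen and Huang's (2012) sparse reduced-rank regression: the coefficient matrix is factored as C = B Aᵀ with rank r, and a row-wise group-lasso penalty on B zeroes out entire predictors. The function name `sparse_rrr`, the alternating Procrustes/proximal-gradient updates, and all tuning values below are our own illustrative assumptions, not the authors' algorithm, which additionally exploits the envelope subspace.

```python
import numpy as np

def sparse_rrr(X, Y, rank, lam, n_iter=200):
    """Alternating fit of Y ~ X @ B @ A.T, where C = B @ A.T has rank <= `rank`
    and a row-wise group-lasso penalty on B performs predictor selection."""
    p = X.shape[1]
    rng = np.random.default_rng(0)
    B = 0.01 * rng.standard_normal((p, rank))
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/Lipschitz constant of the B-gradient
    for _ in range(n_iter):
        # A-step: orthogonal Procrustes problem, solved by the SVD of Y'XB
        U, _, Vt = np.linalg.svd(Y.T @ X @ B, full_matrices=False)
        A = U @ Vt
        # B-step: one proximal-gradient step on
        #   0.5 * ||Y@A - X@B||_F^2 + lam * sum_j ||B_j||_2
        B = B - step * (X.T @ (X @ B - Y @ A))
        norms = np.linalg.norm(B, axis=1, keepdims=True)
        B = B * np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
    return B, A

# Toy data: only the first 3 of 10 predictors carry a rank-2 signal.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))
B_true = np.zeros((10, 2))
B_true[:3] = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
A_true = np.linalg.qr(rng.standard_normal((5, 2)))[0]
Y = X @ B_true @ A_true.T + 0.1 * rng.standard_normal((200, 5))

B_hat, A_hat = sparse_rrr(X, Y, rank=2, lam=8.0)
row_norms = np.linalg.norm(B_hat, axis=1)  # zero rows = dropped predictors
```

With the group penalty active, whole rows of B_hat shrink exactly to zero, so the corresponding predictors drop out of the fitted rank-2 model; the paper's estimator pursues the same selection-plus-reduction goal while gaining efficiency from the envelope structure.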

Suggested Citation

  • Guo, Wenxing & Balakrishnan, Narayanaswamy & He, Mu, 2023. "Envelope-based sparse reduced-rank regression for multivariate linear model," Journal of Multivariate Analysis, Elsevier, vol. 195(C).
  • Handle: RePEc:eee:jmvana:v:195:y:2023:i:c:s0047259x23000052
    DOI: 10.1016/j.jmva.2023.105159

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X23000052
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2023.105159?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ledoit, Olivier & Wolf, Michael, 2004. "A well-conditioned estimator for large-dimensional covariance matrices," Journal of Multivariate Analysis, Elsevier, vol. 88(2), pages 365-411, February.
    2. Adam J. Rothman, 2012. "Positive definite estimators of large covariance matrices," Biometrika, Biometrika Trust, vol. 99(3), pages 733-740.
    3. R. Dennis Cook & Liliana Forzani & Xin Zhang, 2015. "Envelopes and reduced-rank regression," Biometrika, Biometrika Trust, vol. 102(2), pages 439-456.
    4. Z. Su & G. Zhu & X. Chen & Y. Yang, 2016. "Sparse envelope model: efficient estimation and response variable selection in multivariate linear regression," Biometrika, Biometrika Trust, vol. 103(3), pages 579-593.
    5. Hyonho Chun & Sündüz Keleş, 2010. "Sparse partial least squares regression for simultaneous dimension reduction and variable selection," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 72(1), pages 3-25, January.
    6. Cook, R. Dennis & Forzani, Liliana & Su, Zhihua, 2016. "A note on fast envelope estimation," Journal of Multivariate Analysis, Elsevier, vol. 150(C), pages 42-54.
    7. Lian, Heng & Kim, Yongdai, 2016. "Nonconvex penalized reduced rank regression and its oracle properties in high dimensions," Journal of Multivariate Analysis, Elsevier, vol. 143(C), pages 383-393.
    8. Kim, Yongdai & Choi, Hosik & Oh, Hee-Seok, 2008. "Smoothly Clipped Absolute Deviation on High Dimensions," Journal of the American Statistical Association, American Statistical Association, vol. 103(484), pages 1665-1673.
    9. Guo, Xiao & Zhang, Hai & Wang, Yao & Wu, Jiang-Lun, 2015. "Model selection and estimation in high dimensional regression models with group SCAD," Statistics & Probability Letters, Elsevier, vol. 103(C), pages 86-92.
    10. Kun Chen & Kung‐Sik Chan & Nils Chr. Stenseth, 2012. "Reduced rank stochastic regression with a sparse singular value decomposition," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 74(2), pages 203-221, March.
    11. Kun Chen & Hongbo Dong & Kung-Sik Chan, 2013. "Reduced rank regression via adaptive nuclear norm penalization," Biometrika, Biometrika Trust, vol. 100(4), pages 901-920.
    12. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.
    13. Lingzhou Xue & Shiqian Ma & Hui Zou, 2012. "Positive-Definite ℓ 1 -Penalized Estimation of Large Covariance Matrices," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(500), pages 1480-1491, December.
    14. Lisha Chen & Jianhua Z. Huang, 2012. "Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(500), pages 1533-1545, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Luo, Chongliang & Liang, Jian & Li, Gen & Wang, Fei & Zhang, Changshui & Dey, Dipak K. & Chen, Kun, 2018. "Leveraging mixed and incomplete outcomes via reduced-rank modeling," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 378-394.
    2. Goh, Gyuhyeong & Dey, Dipak K. & Chen, Kun, 2017. "Bayesian sparse reduced rank multivariate regression," Journal of Multivariate Analysis, Elsevier, vol. 157(C), pages 14-28.
    3. Minji Lee & Zhihua Su, 2020. "A Review of Envelope Models," International Statistical Review, International Statistical Institute, vol. 88(3), pages 658-676, December.
    4. Dmitry Kobak & Yves Bernaerts & Marissa A. Weis & Federico Scala & Andreas S. Tolias & Philipp Berens, 2021. "Sparse reduced‐rank regression for exploratory visualisation of paired multivariate data," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 70(4), pages 980-1000, August.
    5. Hu, Jianhua & Liu, Xiaoqian & Liu, Xu & Xia, Ningning, 2022. "Some aspects of response variable selection and estimation in multivariate linear regression," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    6. Lian, Heng & Kim, Yongdai, 2016. "Nonconvex penalized reduced rank regression and its oracle properties in high dimensions," Journal of Multivariate Analysis, Elsevier, vol. 143(C), pages 383-393.
    7. Luo, Ruiyan & Qi, Xin, 2017. "Signal extraction approach for sparse multivariate response regression," Journal of Multivariate Analysis, Elsevier, vol. 153(C), pages 83-97.
    8. Bailey, Natalia & Pesaran, M. Hashem & Smith, L. Vanessa, 2019. "A multiple testing approach to the regularisation of large sample correlation matrices," Journal of Econometrics, Elsevier, vol. 208(2), pages 507-534.
    9. Mike K. P. So & Wing Ki Liu & Amanda M. Y. Chu, 2018. "Bayesian Shrinkage Estimation Of Time-Varying Covariance Matrices In Financial Time Series," Advances in Decision Sciences, Asia University, Taiwan, vol. 22(1), pages 369-404, December.
    10. Ahelegbey, Daniel Felix, 2015. "The Econometrics of Bayesian Graphical Models: A Review With Financial Application," MPRA Paper 92634, University Library of Munich, Germany, revised 25 Apr 2016.
    11. Alvaro Mendez-Civieta & M. Carmen Aguilera-Morillo & Rosa E. Lillo, 2021. "Adaptive sparse group LASSO in quantile regression," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 547-573, September.
    12. Yu, Philip L.H. & Wang, Xiaohang & Zhu, Yuanyuan, 2017. "High dimensional covariance matrix estimation by penalizing the matrix-logarithm transformed likelihood," Computational Statistics & Data Analysis, Elsevier, vol. 114(C), pages 12-25.
    13. Vahe Avagyan & Andrés M. Alonso & Francisco J. Nogales, 2018. "D-trace estimation of a precision matrix using adaptive Lasso penalties," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 12(2), pages 425-447, June.
    14. Ding, Yi & Li, Yingying & Zheng, Xinghua, 2021. "High dimensional minimum variance portfolio estimation under statistical factor models," Journal of Econometrics, Elsevier, vol. 222(1), pages 502-515.
    15. Li, Peili & Xiao, Yunhai, 2018. "An efficient algorithm for sparse inverse covariance matrix estimation based on dual formulation," Computational Statistics & Data Analysis, Elsevier, vol. 128(C), pages 292-307.
    16. Avagyan, Vahe & Alonso Fernández, Andrés Modesto & Nogales, Francisco J., 2015. "D-trace Precision Matrix Estimation Using Adaptive Lasso Penalties," DES - Working Papers. Statistics and Econometrics. WS 21775, Universidad Carlos III de Madrid. Departamento de Estadística.
    17. Yan Zhang & Jiyuan Tao & Zhixiang Yin & Guoqiang Wang, 2022. "Improved Large Covariance Matrix Estimation Based on Efficient Convex Combination and Its Application in Portfolio Optimization," Mathematics, MDPI, vol. 10(22), pages 1-15, November.
    18. Feng, Sanying & Lian, Heng & Zhu, Fukang, 2016. "Reduced rank regression with possibly non-smooth criterion functions: An empirical likelihood approach," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 139-150.
    19. Yue Zhao & Ingrid Van Keilegom & Shanshan Ding, 2022. "Envelopes for censored quantile regression," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 49(4), pages 1562-1585, December.
    20. Mishra, Aditya & Dey, Dipak K. & Chen, Yong & Chen, Kun, 2021. "Generalized co-sparse factor regression," Computational Statistics & Data Analysis, Elsevier, vol. 157(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:195:y:2023:i:c:s0047259x23000052. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.