IDEAS home Printed from https://ideas.repec.org/a/eee/jmvana/v153y2017icp83-97.html

Signal extraction approach for sparse multivariate response regression

Author

Listed:
  • Luo, Ruiyan
  • Qi, Xin

Abstract

In this paper, we consider multivariate response regression models with high-dimensional predictor variables. One way to estimate the coefficient matrix is through its decomposition. Among the various decompositions of the coefficient matrix, we focus on the one that, for any given rank, yields the best approximation to the signal part of the response vector. Finding this decomposition is equivalent to performing a principal component analysis of the signal. For any given rank, this decomposition has nearly the smallest expected prediction error among all decompositions of the coefficient matrix with the same rank. To estimate the decomposition, we solve a penalized generalized eigenvalue problem followed by a least squares procedure. In the high-dimensional setting, allowing a general covariance structure for the noise vector, we establish oracle inequalities for the estimates. Simulation studies and an application to real data show that the proposed method has good prediction performance and is efficient at dimension reduction across various models.
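The core idea can be illustrated without the sparsity penalty. In the low-dimensional case with identity noise covariance, the rank-k decomposition that best approximates the signal part XB reduces to a PCA of the fitted values: fit least squares, take the SVD of the fitted responses, and truncate. The sketch below, with made-up simulated data, shows only this unpenalized special case, not the authors' penalized generalized eigenvalue procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate: n samples, p predictors, q responses, true rank-r coefficient matrix.
n, p, q, r = 200, 10, 6, 2
X = rng.standard_normal((n, p))
B_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
Y = X @ B_true + 0.1 * rng.standard_normal((n, q))

# Ordinary least squares estimate (p < n here, so this is well-posed).
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Rank-k approximation to the signal part X @ B: SVD of the fitted
# values, truncated to the top k components (a PCA of the signal).
k = 2
U, s, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
fitted_k = U[:, :k] * s[:k] @ Vt[:k, :]

# The induced rank-k coefficient matrix projects B_ols onto the span of
# the top-k right singular vectors of the fitted values.
B_k = B_ols @ Vt[:k, :].T @ Vt[:k, :]

err_full = np.linalg.norm(Y - X @ B_ols)
err_rank_k = np.linalg.norm(Y - X @ B_k)
```

Because the true coefficient matrix has rank 2, the rank-2 fit loses almost nothing relative to the full least squares fit while using a far smaller effective parameter count.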

Suggested Citation

  • Luo, Ruiyan & Qi, Xin, 2017. "Signal extraction approach for sparse multivariate response regression," Journal of Multivariate Analysis, Elsevier, vol. 153(C), pages 83-97.
  • Handle: RePEc:eee:jmvana:v:153:y:2017:i:c:p:83-97
    DOI: 10.1016/j.jmva.2016.09.005

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0047259X16300884
    Download Restriction: Full text for ScienceDirect subscribers only

    File URL: https://libkey.io/10.1016/j.jmva.2016.09.005?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Simila, Timo & Tikka, Jarkko, 2007. "Input selection and shrinkage in multiresponse linear regression," Computational Statistics & Data Analysis, Elsevier, vol. 52(1), pages 406-422, September.
    2. Hyonho Chun & Sündüz Keleş, 2010. "Sparse partial least squares regression for simultaneous dimension reduction and variable selection," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 72(1), pages 3-25, January.
    3. Qi, Xin & Luo, Ruiyan & Zhao, Hongyu, 2013. "Sparse principal component analysis by choice of norm," Journal of Multivariate Analysis, Elsevier, vol. 114(C), pages 127-160.
    4. Izenman, Alan Julian, 1975. "Reduced-rank regression for the multivariate linear model," Journal of Multivariate Analysis, Elsevier, vol. 5(2), pages 248-264, June.
    5. Kun Chen & Kung‐Sik Chan & Nils Chr. Stenseth, 2012. "Reduced rank stochastic regression with a sparse singular value decomposition," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 74(2), pages 203-221, March.
    6. Lisha Chen & Jianhua Z. Huang, 2012. "Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(500), pages 1533-1545, December.
    7. Camba-Mendez, Gonzalo, et al., 2003. "Tests of Rank in Reduced Rank Regression Models," Journal of Business & Economic Statistics, American Statistical Association, vol. 21(1), pages 145-155, January.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dmitry Kobak & Yves Bernaerts & Marissa A. Weis & Federico Scala & Andreas S. Tolias & Philipp Berens, 2021. "Sparse reduced‐rank regression for exploratory visualisation of paired multivariate data," Journal of the Royal Statistical Society Series C, Royal Statistical Society, vol. 70(4), pages 980-1000, August.
    2. Goh, Gyuhyeong & Dey, Dipak K. & Chen, Kun, 2017. "Bayesian sparse reduced rank multivariate regression," Journal of Multivariate Analysis, Elsevier, vol. 157(C), pages 14-28.
    3. Dong, Ruipeng & Li, Daoji & Zheng, Zemin, 2021. "Parallel integrative learning for large-scale multi-response regression with incomplete outcomes," Computational Statistics & Data Analysis, Elsevier, vol. 160(C).
    4. Guo, Wenxing & Balakrishnan, Narayanaswamy & He, Mu, 2023. "Envelope-based sparse reduced-rank regression for multivariate linear model," Journal of Multivariate Analysis, Elsevier, vol. 195(C).
    5. An, Baiguo & Zhang, Beibei, 2017. "Simultaneous selection of predictors and responses for high dimensional multivariate linear regression," Statistics & Probability Letters, Elsevier, vol. 127(C), pages 173-177.
    6. Lansangan, Joseph Ryan G. & Barrios, Erniel B., 2017. "Simultaneous dimension reduction and variable selection in modeling high dimensional data," Computational Statistics & Data Analysis, Elsevier, vol. 112(C), pages 242-256.
    7. Lian, Heng & Feng, Sanying & Zhao, Kaifeng, 2015. "Parametric and semiparametric reduced-rank regression with flexible sparsity," Journal of Multivariate Analysis, Elsevier, vol. 136(C), pages 163-174.
    8. Lian, Heng & Kim, Yongdai, 2016. "Nonconvex penalized reduced rank regression and its oracle properties in high dimensions," Journal of Multivariate Analysis, Elsevier, vol. 143(C), pages 383-393.
    9. Kawano, Shuichi & Fujisawa, Hironori & Takada, Toyoyuki & Shiroishi, Toshihiko, 2015. "Sparse principal component regression with adaptive loading," Computational Statistics & Data Analysis, Elsevier, vol. 89(C), pages 192-203.
    10. Debamita Kundu & Riten Mitra & Jeremy T. Gaskins, 2021. "Bayesian variable selection for multioutcome models through shared shrinkage," Scandinavian Journal of Statistics, Danish Society for Theoretical Statistics;Finnish Statistical Society;Norwegian Statistical Association;Swedish Statistical Association, vol. 48(1), pages 295-320, March.
    11. Luo, Chongliang & Liang, Jian & Li, Gen & Wang, Fei & Zhang, Changshui & Dey, Dipak K. & Chen, Kun, 2018. "Leveraging mixed and incomplete outcomes via reduced-rank modeling," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 378-394.
    12. Feng, Sanying & Lian, Heng & Zhu, Fukang, 2016. "Reduced rank regression with possibly non-smooth criterion functions: An empirical likelihood approach," Computational Statistics & Data Analysis, Elsevier, vol. 103(C), pages 139-150.
    13. Fujikoshi, Yasunori & Sakurai, Tetsuro, 2016. "High-dimensional consistency of rank estimation criteria in multivariate linear model," Journal of Multivariate Analysis, Elsevier, vol. 149(C), pages 199-212.
    14. Minji Lee & Zhihua Su, 2020. "A Review of Envelope Models," International Statistical Review, International Statistical Institute, vol. 88(3), pages 658-676, December.
    15. Mishra, Aditya & Dey, Dipak K. & Chen, Yong & Chen, Kun, 2021. "Generalized co-sparse factor regression," Computational Statistics & Data Analysis, Elsevier, vol. 157(C).
    16. Xing Gao & Sungwon Lee & Gen Li & Sungkyu Jung, 2021. "Covariate‐driven factorization by thresholding for multiblock data," Biometrics, The International Biometric Society, vol. 77(3), pages 1011-1023, September.
    17. Canhong Wen & Zhenduo Li & Ruipeng Dong & Yijin Ni & Wenliang Pan, 2023. "Simultaneous Dimension Reduction and Variable Selection for Multinomial Logistic Regression," INFORMS Journal on Computing, INFORMS, vol. 35(5), pages 1044-1060, September.
    18. Hu, Jianhua & Liu, Xiaoqian & Liu, Xu & Xia, Ningning, 2022. "Some aspects of response variable selection and estimation in multivariate linear regression," Journal of Multivariate Analysis, Elsevier, vol. 188(C).
    19. Bai, Ray & Ghosh, Malay, 2018. "High-dimensional multivariate posterior consistency under global–local shrinkage priors," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 157-170.
    20. Zhao, Weihua & Jiang, Xuejun & Lian, Heng, 2018. "A principal varying-coefficient model for quantile regression: Joint variable selection and dimension reduction," Computational Statistics & Data Analysis, Elsevier, vol. 127(C), pages 269-280.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:jmvana:v:153:y:2017:i:c:p:83-97. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/wps/find/journaldescription.cws_home/622892/description#description.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.