Printed from https://ideas.repec.org/a/eee/csdana/v116y2017icp1-18.html

Trace regression model with simultaneously low rank and row(column) sparse parameter

Author

Listed:
  • Zhao, Junlong
  • Niu, Lu
  • Zhan, Shushi

Abstract

In this paper, we consider the trace regression model with matrix covariates, where the parameter matrix is simultaneously low rank and row (column) sparse. To estimate the parameter, we formulate a convex optimization problem combining the nuclear norm and group Lasso penalties, and propose an alternating direction method of multipliers (ADMM) algorithm to solve it. The asymptotic properties of the estimator are established. Simulation results confirm the effectiveness of our method.
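
To make the formulation concrete, the sketch below is a minimal, illustrative ADMM for a problem of the kind described in the abstract: a squared trace-regression loss plus a nuclear-norm penalty (low rank) and a row-wise group Lasso penalty (row sparsity), with the parameter split into two auxiliary copies so that each penalty has a closed-form proximal step. This is not the authors' implementation; the variable names (lam_nuc, lam_row, rho), the fixed iteration count, and the toy data at the end are illustrative assumptions.

# Illustrative sketch (not the paper's code): ADMM for
#   min_B  (1/2n) * sum_i (y_i - tr(X_i' B))^2
#          + lam_nuc * ||B||_*  +  lam_row * sum_j ||B[j, :]||_2
# with splits B = C (nuclear-norm block) and B = D (group Lasso block).
import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * ||.||_*
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def row_soft_threshold(M, tau):
    # Row-wise group soft-thresholding: prox of tau * sum_j ||M[j, :]||_2
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return scale * M

def admm_trace_regression(X, y, lam_nuc, lam_row, rho=1.0, n_iter=500):
    # X: (n, p, q) array of matrix covariates, y: (n,) responses.
    n, p, q = X.shape
    A = X.reshape(n, p * q)                  # design matrix acting on vec(B)
    B = np.zeros((p, q)); C = B.copy(); D = B.copy()
    U1 = np.zeros_like(B); U2 = np.zeros_like(B)   # scaled dual variables
    # The B-update solves a fixed linear system, so precompute it once.
    G = A.T @ A / n + 2.0 * rho * np.eye(p * q)
    Aty = A.T @ y / n
    for _ in range(n_iter):
        rhs = Aty + rho * (C - U1 + D - U2).ravel()
        B = np.linalg.solve(G, rhs).reshape(p, q)          # least-squares step
        C = svt(B + U1, lam_nuc / rho)                      # low-rank block
        D = row_soft_threshold(B + U2, lam_row / rho)       # row-sparse block
        U1 += B - C
        U2 += B - D
    return B, C, D

# Toy usage: a rank-1, row-sparse true parameter.
rng = np.random.default_rng(0)
n, p, q = 200, 20, 15
B_true = np.zeros((p, q))
B_true[:3] = np.outer(np.ones(3), rng.normal(size=q))
X = rng.normal(size=(n, p, q))
y = np.einsum('ijk,jk->i', X, B_true) + 0.1 * rng.normal(size=n)
B_hat, _, _ = admm_trace_regression(X, y, lam_nuc=0.05, lam_row=0.05)
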

Suggested Citation

  • Zhao, Junlong & Niu, Lu & Zhan, Shushi, 2017. "Trace regression model with simultaneously low rank and row(column) sparse parameter," Computational Statistics & Data Analysis, Elsevier, vol. 116(C), pages 1-18.
  • Handle: RePEc:eee:csdana:v:116:y:2017:i:c:p:1-18
    DOI: 10.1016/j.csda.2017.06.009

    Download full text from publisher

    File URL: http://www.sciencedirect.com/science/article/pii/S0167947317301482
    Download Restriction: Full text for ScienceDirect subscribers only.

    File URL: https://libkey.io/10.1016/j.csda.2017.06.009?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Jianhui Chen & Jieping Ye, 2014. "Sparse trace norm regularization," Computational Statistics, Springer, vol. 29(3), pages 623-639, June.
    2. Hua Zhou & Lexin Li & Hongtu Zhu, 2013. "Tensor Regression with Applications in Neuroimaging Data Analysis," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 108(502), pages 540-552, June.
    3. Hua Zhou & Lexin Li, 2014. "Regularized matrix regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 76(2), pages 463-483, March.
    4. Ming Yuan & Yi Lin, 2006. "Model selection and estimation in regression with grouped variables," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 68(1), pages 49-67, February.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Wang, Lei & Zhang, Jing & Li, Bo & Liu, Xiaohui, 2022. "Quantile trace regression via nuclear norm regularization," Statistics & Probability Letters, Elsevier, vol. 182(C).
    2. Yiting Ma & Pan Shang & Lingchen Kong, 2025. "Tuning parameter selection for the adaptive nuclear norm regularized trace regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 77(3), pages 491-516, June.
    3. Ling Peng & Xiaohui Liu & Xiangyong Tan & Yiweng Zhou & Shihua Luo, 2024. "The statistical rate for support matrix machines under low rankness and row (column) sparsity," Statistical Papers, Springer, vol. 65(7), pages 4567-4598, September.
    4. Xiumin Liu & Lu Niu & Junlong Zhao, 2023. "Statistical inference on the significance of rows and columns for matrix-valued data in an additive model," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 32(3), pages 785-828, September.
    5. Zhou, Chengyu & Fang, Xiaolei, 2023. "A convex two-dimensional variable selection method for the root-cause diagnostics of product defects," Reliability Engineering and System Safety, Elsevier, vol. 229(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Xiaoshan Li & Da Xu & Hua Zhou & Lexin Li, 2018. "Tucker Tensor Regression and Neuroimaging Analysis," Statistics in Biosciences, Springer;International Chinese Statistical Association, vol. 10(3), pages 520-545, December.
    2. Luo, Chongliang & Liang, Jian & Li, Gen & Wang, Fei & Zhang, Changshui & Dey, Dipak K. & Chen, Kun, 2018. "Leveraging mixed and incomplete outcomes via reduced-rank modeling," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 378-394.
    3. Yiting Ma & Pan Shang & Lingchen Kong, 2025. "Tuning parameter selection for the adaptive nuclear norm regularized trace regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 77(3), pages 491-516, June.
    4. Xiumin Liu & Lu Niu & Junlong Zhao, 2023. "Statistical inference on the significance of rows and columns for matrix-valued data in an additive model," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 32(3), pages 785-828, September.
    5. Wang, Lei & Zhang, Jing & Li, Bo & Liu, Xiaohui, 2022. "Quantile trace regression via nuclear norm regularization," Statistics & Probability Letters, Elsevier, vol. 182(C).
    6. Xin Li & Dongya Wu, 2024. "Low-rank matrix estimation via nonconvex optimization methods in multi-response errors-in-variables regression," Journal of Global Optimization, Springer, vol. 88(1), pages 79-114, January.
    7. Xin Li & Dongya Wu, 2025. "Low-Rank Matrix Recovery Via Nonconvex Optimization Methods with Application to Errors-in-Variables Matrix Regression," Journal of Optimization Theory and Applications, Springer, vol. 205(3), pages 1-27, June.
    8. Chen, Yang & Luo, Ziyan & Kong, Lingchen, 2024. "Low-rank tensor regression for selection of grouped variables," Journal of Multivariate Analysis, Elsevier, vol. 203(C).
    9. Zengchao Xu & Shan Luo & Zehua Chen, 2023. "A Portmanteau Local Feature Discrimination Approach to the Classification with High-dimensional Matrix-variate Data," Sankhya A: The Indian Journal of Statistics, Springer;Indian Statistical Institute, vol. 85(1), pages 441-467, February.
    10. Philip T. Reiss & Jeff Goldsmith & Han Lin Shang & R. Todd Ogden, 2017. "Methods for Scalar-on-Function Regression," International Statistical Review, International Statistical Institute, vol. 85(2), pages 228-249, August.
    11. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    12. Lin Liu, 2021. "Matrix‐based introduction to multivariate data analysis, by Kohei Adachi, 2nd edition. Singapore: Springer Nature, 2020. pp. 457," Biometrics, The International Biometric Society, vol. 77(4), pages 1498-1500, December.
    13. Zhaoxing Gao & Ruey S. Tsay, 2021. "Divide-and-Conquer: A Distributed Hierarchical Factor Approach to Modeling Large-Scale Time Series Data," Papers 2103.14626, arXiv.org.
    14. Jun Yan & Jian Huang, 2012. "Model Selection for Cox Models with Time-Varying Coefficients," Biometrics, The International Biometric Society, vol. 68(2), pages 419-428, June.
    15. Ye, Ya-Fen & Shao, Yuan-Hai & Deng, Nai-Yang & Li, Chun-Na & Hua, Xiang-Yu, 2017. "Robust Lp-norm least squares support vector regression with feature selection," Applied Mathematics and Computation, Elsevier, vol. 305(C), pages 32-52.
    16. Guillaume Sagnol & Edouard Pauwels, 2019. "An unexpected connection between Bayes A-optimal designs and the group lasso," Statistical Papers, Springer, vol. 60(2), pages 565-584, April.
    17. Diego Vidaurre & Concha Bielza & Pedro Larrañaga, 2013. "A Survey of L1 Regression," International Statistical Review, International Statistical Institute, vol. 81(3), pages 361-387, December.
    18. Nickolay Trendafilov & Martin Kleinsteuber & Hui Zou, 2014. "Sparse matrices in data analysis," Computational Statistics, Springer, vol. 29(3), pages 403-405, June.
    19. Bakalli, Gaetan & Guerrier, Stéphane & Scaillet, Olivier, 2023. "A penalized two-pass regression to predict stock returns with time-varying risk premia," Journal of Econometrics, Elsevier, vol. 237(2).
    20. Shuichi Kawano, 2014. "Selection of tuning parameters in bridge regression models via Bayesian information criterion," Statistical Papers, Springer, vol. 55(4), pages 1207-1223, November.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:eee:csdana:v:116:y:2017:i:c:p:1-18. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register. This allows your profile to be linked to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Catherine Liu (email available below). General contact details of provider: http://www.elsevier.com/locate/csda.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.