Printed from https://ideas.repec.org/a/spr/compst/v29y2014i3p591-621.html

Fixed-rank matrix factorizations and Riemannian low-rank optimization

Author

Listed:
  • Bamdev Mishra
  • Gilles Meyer
  • Silvère Bonnabel
  • Rodolphe Sepulchre

Abstract

Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and then exploit the Riemannian quotient geometry of the search space in the design of a class of gradient descent and trust-region algorithms. The proposed algorithms generalize our previous results on fixed-rank symmetric positive semidefinite matrices, apply to a broad range of applications, scale to high-dimensional problems, and confer a geometric basis to recent contributions on the learning of fixed-rank non-symmetric matrices. We make connections with existing algorithms in the context of low-rank matrix completion and discuss the usefulness of the proposed framework. Numerical experiments suggest that the proposed algorithms compete with state-of-the-art algorithms and that manifold optimization offers an effective and versatile framework for the design of machine learning algorithms that learn a fixed-rank matrix. Copyright Springer-Verlag Berlin Heidelberg 2014
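The abstract's core idea, first-order optimization over a fixed-rank factorization, can be illustrated with a minimal sketch. The following is a hypothetical Euclidean gradient descent on the factors of X = GHᵀ for low-rank matrix completion on synthetic data; it is not the paper's algorithm, whose contribution is precisely the Riemannian quotient geometry (invariant metrics, retractions, trust regions) layered on top of such factorizations.

```python
import numpy as np

# Illustrative sketch only (synthetic data, Euclidean gradients):
# complete a rank-r matrix from partial observations by descending
# on the factors of X = G @ H.T.
rng = np.random.default_rng(0)
m, n, r = 30, 20, 2

# Ground-truth rank-r matrix and a random observation mask.
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.6  # observe roughly 60% of entries

def cost(G, H):
    # Squared error on the observed entries only.
    R = (G @ H.T - X_true) * mask
    return 0.5 * np.sum(R ** 2)

G = rng.standard_normal((m, r))
H = rng.standard_normal((n, r))
step = 0.005

initial = cost(G, H)
for _ in range(1000):
    R = (G @ H.T - X_true) * mask              # masked residual
    G, H = G - step * (R @ H), H - step * (R.T @ G)  # factor gradients
final = cost(G, H)
```

Because the factorization X = GHᵀ is invariant under G ↦ GM, H ↦ HM⁻ᵀ for invertible M, the factors are not unique; the quotient-manifold viewpoint studied in the paper removes exactly this indeterminacy from the search space.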

Suggested Citation

  • Bamdev Mishra & Gilles Meyer & Silvère Bonnabel & Rodolphe Sepulchre, 2014. "Fixed-rank matrix factorizations and Riemannian low-rank optimization," Computational Statistics, Springer, vol. 29(3), pages 591-621, June.
  • Handle: RePEc:spr:compst:v:29:y:2014:i:3:p:591-621
    DOI: 10.1007/s00180-013-0464-z

    Download full text from publisher

    File URL: http://hdl.handle.net/10.1007/s00180-013-0464-z
    Download Restriction: Access to full text is restricted to subscribers.

    File URL: https://libkey.io/10.1007/s00180-013-0464-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy available through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ming Yuan & Ali Ekici & Zhaosong Lu & Renato Monteiro, 2007. "Dimension reduction and coefficient estimation in multivariate linear regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 69(3), pages 329-346, June.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Nickolay Trendafilov & Martin Kleinsteuber & Hui Zou, 2014. "Sparse matrices in data analysis," Computational Statistics, Springer, vol. 29(3), pages 403-405, June.
    2. Marie Billaud-Friess & Antonio Falcó & Anthony Nouy, 2021. "Principal Bundle Structure of Matrix Manifolds," Mathematics, MDPI, vol. 9(14), pages 1-17, July.
    3. Ke Wang & Zhuo Chen & Shihui Ying & Xinjian Xu, 2023. "Low-Rank Matrix Completion via QR-Based Retraction on Manifolds," Mathematics, MDPI, vol. 11(5), pages 1-17, February.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Donghwi Nam & Ja-Yong Koo & Kwan-Young Bak, 2025. "Dimensionality reduction in multivariate nonparametric regression via nuclear norm penalization," Statistical Papers, Springer, vol. 66(3), pages 1-33, April.
    2. Lee, Wonyul & Liu, Yufeng, 2012. "Simultaneous multiple response regression and inverse covariance matrix estimation via penalized Gaussian maximum likelihood," Journal of Multivariate Analysis, Elsevier, vol. 111(C), pages 241-255.
    3. Chen, Canyi & Xu, Wangli & Zhu, Liping, 2022. "Distributed estimation in heterogeneous reduced rank regression: With application to order determination in sufficient dimension reduction," Journal of Multivariate Analysis, Elsevier, vol. 190(C).
    4. Matsui, Hidetoshi, 2014. "Variable and boundary selection for functional data via multiclass logistic regression modeling," Computational Statistics & Data Analysis, Elsevier, vol. 78(C), pages 176-185.
    5. Zehua Chen & Yiwei Jiang, 2020. "A two-stage sequential conditional selection approach to sparse high-dimensional multivariate regression models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 72(1), pages 65-90, February.
    6. Luo, Chongliang & Liang, Jian & Li, Gen & Wang, Fei & Zhang, Changshui & Dey, Dipak K. & Chen, Kun, 2018. "Leveraging mixed and incomplete outcomes via reduced-rank modeling," Journal of Multivariate Analysis, Elsevier, vol. 167(C), pages 378-394.
    7. An, Baiguo & Zhang, Beibei, 2017. "Simultaneous selection of predictors and responses for high dimensional multivariate linear regression," Statistics & Probability Letters, Elsevier, vol. 127(C), pages 173-177.
    8. Miyashiro, Ryuhei & Takano, Yuichi, 2015. "Mixed integer second-order cone programming formulations for variable selection in linear regression," European Journal of Operational Research, Elsevier, vol. 247(3), pages 721-731.
    9. Fujikoshi, Yasunori & Sakurai, Tetsuro, 2016. "High-dimensional consistency of rank estimation criteria in multivariate linear model," Journal of Multivariate Analysis, Elsevier, vol. 149(C), pages 199-212.
    10. repec:hum:wpaper:sfb649dp2016-018 is not listed on IDEAS
    11. Changliang Zou & Xianghui Ning & Fugee Tsung, 2012. "LASSO-based multivariate linear profile monitoring," Annals of Operations Research, Springer, vol. 192(1), pages 3-19, January.
    12. Yiting Ma & Pan Shang & Lingchen Kong, 2025. "Tuning parameter selection for the adaptive nuclear norm regularized trace regression," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 77(3), pages 491-516, June.
    13. Chao, Shih-Kang & Härdle, Wolfgang K. & Huang, Chen, 2018. "Multivariate factorizable expectile regression with application to fMRI data," Computational Statistics & Data Analysis, Elsevier, vol. 121(C), pages 1-19.
    14. Mishra, Aditya & Dey, Dipak K. & Chen, Yong & Chen, Kun, 2021. "Generalized co-sparse factor regression," Computational Statistics & Data Analysis, Elsevier, vol. 157(C).
    15. Marie Levakova & Susanne Ditlevsen, 2024. "Penalisation Methods in Fitting High‐Dimensional Cointegrated Vector Autoregressive Models: A Review," International Statistical Review, International Statistical Institute, vol. 92(2), pages 160-193, August.
    16. Pan Shang & Lingchen Kong, 2021. "Regularization Parameter Selection for the Low Rank Matrix Recovery," Journal of Optimization Theory and Applications, Springer, vol. 189(3), pages 772-792, June.
    17. Qiu, Yue & Zheng, Yuchen, 2023. "Improving box office projections through sentiment analysis: Insights from regularization-based forecast combinations," Economic Modelling, Elsevier, vol. 125(C).
    18. Li, Mei & Kong, Lingchen, 2019. "Double fused Lasso penalized LAD for matrix regression," Applied Mathematics and Computation, Elsevier, vol. 357(C), pages 119-138.
    19. Vladimir M. Cvetković & Neda Nikolić & Adem Ocal & Jovana Martinović & Aleksandar Dragašević, 2022. "A Predictive Model of Pandemic Disaster Fear Caused by Coronavirus (COVID-19): Implications for Decision-Makers," IJERPH, MDPI, vol. 19(2), pages 1-27, January.
    20. Jin Liu & Jian Huang & Shuangge Ma, 2012. "Analysis of Genome-Wide Association Studies with Multiple Outcomes Using Penalization," PLOS ONE, Public Library of Science, vol. 7(12), pages 1-12, December.
    21. Goh, Gyuhyeong & Dey, Dipak K. & Chen, Kun, 2017. "Bayesian sparse reduced rank multivariate regression," Journal of Multivariate Analysis, Elsevier, vol. 157(C), pages 14-28.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:compst:v:29:y:2014:i:3:p:591-621. See general information about how to correct material in RePEc.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.