
Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization

Author

Listed:
  • P. Tseng

    (University of Washington)

  • S. Yun

    (National University of Singapore)

Abstract

We consider the problem of minimizing the weighted sum of a smooth function f and a convex function P of n real variables subject to m linear equality constraints. We propose a block-coordinate gradient descent method for solving this problem, with the coordinate block chosen by a Gauss-Southwell-q rule based on sufficient predicted descent. We establish global convergence to first-order stationarity for this method and, under a local error bound assumption, linear rate of convergence. If f is convex with Lipschitz continuous gradient, then the method terminates in O(n^2/ε) iterations with an ε-optimal solution. If P is separable, then the Gauss-Southwell-q rule is implementable in O(n) operations when m=1 and in O(n^2) operations when m>1. In the special case of support vector machines training, for which f is convex quadratic, P is separable, and m=1, this complexity bound is comparable to the best known bound for decomposition methods. If f is convex, then, by gradually reducing the weight on P to zero, the method can be adapted to solve the bilevel problem of minimizing P over the set of minima of f+δ_X, where δ_X denotes the indicator function of X, the closure of the feasible set. This has application in the least 1-norm solution of maximum-likelihood estimation.
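The m=1 case described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' method: it drops the nonsmooth term (P = 0), uses a fixed step size rather than the paper's line search, and uses coordinate blocks of size two. With a single equality constraint sum(x) = b, a feasible update must move mass between coordinates, so each iteration picks the pair with the largest gradient gap, a Gauss-Southwell-style "most violating pair" choice familiar from SVM decomposition methods.

```python
def bcgd_pair(grad_f, x, steps=200, lr=0.1):
    """Pairwise block-coordinate descent for min f(x) s.t. sum(x) = b.

    Each step transfers mass between the coordinates with the largest
    and smallest partial derivatives, which preserves sum(x) exactly.
    """
    for _ in range(steps):
        g = grad_f(x)
        i = max(range(len(x)), key=lambda k: g[k])  # largest partial derivative
        j = min(range(len(x)), key=lambda k: g[k])  # smallest partial derivative
        t = lr * (g[i] - g[j]) / 2.0                # descent step along e_j - e_i
        x[i] -= t
        x[j] += t
    return x

# Example: f(x) = 0.5 * sum((x_k - c_k)^2) minimized over sum(x) = 6.
c = [3.0, 1.0, 2.0]

def grad(x):
    return [x_k - c_k for x_k, c_k in zip(x, c)]

b = 6.0
x0 = [b / 3.0] * 3          # feasible starting point: sum(x0) = b
x = bcgd_pair(grad, x0)      # converges toward c, since sum(c) = b
```

Because the update direction e_j - e_i lies in the null space of the constraint, every iterate stays feasible, which is the structural reason the m=1 rule is implementable in O(n) operations per iteration.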

Suggested Citation

  • P. Tseng & S. Yun, 2009. "Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization," Journal of Optimization Theory and Applications, Springer, vol. 140(3), pages 513-535, March.
  • Handle: RePEc:spr:joptap:v:140:y:2009:i:3:d:10.1007_s10957-008-9458-3
    DOI: 10.1007/s10957-008-9458-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-008-9458-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-008-9458-3?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. P. Tseng, 2001. "Convergence of a Block Coordinate Descent Method for Nondifferentiable Minimization," Journal of Optimization Theory and Applications, Springer, vol. 109(3), pages 475-494, June.
    2. C. J. Lin & S. Lucidi & L. Palagi & A. Risi & M. Sciandrone, 2009. "Decomposition Algorithm Model for Singly Linearly-Constrained Problems Subject to Lower and Upper Bounds," Journal of Optimization Theory and Applications, Springer, vol. 141(1), pages 107-126, April.
    3. Lukas Meier & Sara Van De Geer & Peter Bühlmann, 2008. "The group lasso for logistic regression," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 70(1), pages 53-71, February.
    4. Zhi-Quan Luo & Paul Tseng, 1993. "On the Convergence Rate of Dual Ascent Methods for Linearly Constrained Convex Minimization," Mathematics of Operations Research, INFORMS, vol. 18(4), pages 846-867, November.
    5. K. C. Kiwiel, 2007. "On Linear-Time Algorithms for the Continuous Quadratic Knapsack Problem," Journal of Optimization Theory and Applications, Springer, vol. 134(3), pages 549-554, September.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Fan, Xinyan & Zhang, Qingzhao & Ma, Shuangge & Fang, Kuangnan, 2021. "Conditional score matching for high-dimensional partial graphical models," Computational Statistics & Data Analysis, Elsevier, vol. 153(C).
    2. Baiguo An & Beibei Zhang, 2020. "Logistic regression with image covariates via the combination of L1 and Sobolev regularizations," PLOS ONE, Public Library of Science, vol. 15(6), pages 1-18, June.
    3. Kaida Cai & Hua Shen & Xuewen Lu, 2022. "Adaptive bi-level variable selection for multivariate failure time model with a diverging number of covariates," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 31(4), pages 968-993, December.
    4. Cassioli, A. & Di Lorenzo, D. & Sciandrone, M., 2013. "On the convergence of inexact block coordinate descent methods for constrained optimization," European Journal of Operational Research, Elsevier, vol. 231(2), pages 274-281.
    5. Vincent, Martin & Hansen, Niels Richard, 2014. "Sparse group lasso and high dimensional multinomial classification," Computational Statistics & Data Analysis, Elsevier, vol. 71(C), pages 771-786.
    6. Dewei Zhang & Yin Liu & Sam Davanloo Tajbakhsh, 2022. "A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure," INFORMS Journal on Computing, INFORMS, vol. 34(2), pages 1126-1140, March.
    7. Paul Tseng & Sangwoon Yun, 2010. "A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training," Computational Optimization and Applications, Springer, vol. 47(2), pages 179-206, October.
    8. Ion Necoara & Andrei Patrascu, 2014. "A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints," Computational Optimization and Applications, Springer, vol. 57(2), pages 307-337, March.
    9. Mingrui Zhong & Zanhua Yin & Zhichao Wang, 2023. "Variable Selection for Sparse Logistic Regression with Grouped Variables," Mathematics, MDPI, vol. 11(24), pages 1-21, December.
    10. Mingyi Hong & Tsung-Hui Chang & Xiangfeng Wang & Meisam Razaviyayn & Shiqian Ma & Zhi-Quan Luo, 2020. "A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization," Mathematics of Operations Research, INFORMS, vol. 45(3), pages 833-861, August.
    11. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    12. Shuang Zhang & Xingdong Feng, 2022. "Distributed identification of heterogeneous treatment effects," Computational Statistics, Springer, vol. 37(1), pages 57-89, March.
    13. Jung, Yoon Mo & Whang, Joyce Jiyoung & Yun, Sangwoon, 2020. "Sparse probabilistic K-means," Applied Mathematics and Computation, Elsevier, vol. 382(C).
    14. Seunghwan Lee & Sang Cheol Kim & Donghyeon Yu, 2023. "An efficient GPU-parallel coordinate descent algorithm for sparse precision matrix estimation via scaled lasso," Computational Statistics, Springer, vol. 38(1), pages 217-242, March.
    15. Le Thi Khanh Hien & Duy Nhat Phan & Nicolas Gillis, 2022. "Inertial alternating direction method of multipliers for non-convex non-smooth optimization," Computational Optimization and Applications, Springer, vol. 83(1), pages 247-285, September.
    16. Victor Chernozhukov & Whitney K. Newey & Victor Quintas-Martinez & Vasilis Syrgkanis, 2021. "Automatic Debiased Machine Learning via Riesz Regression," Papers 2104.14737, arXiv.org, revised Mar 2024.
    17. repec:jss:jstsof:33:i01 is not listed on IDEAS
    18. Bilin Zeng & Xuerong Meggie Wen & Lixing Zhu, 2017. "A link-free sparse group variable selection method for single-index model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 44(13), pages 2388-2400, October.
    19. Olga Klopp & Marianna Pensky, 2013. "Sparse High-dimensional Varying Coefficient Model : Non-asymptotic Minimax Study," Working Papers 2013-30, Center for Research in Economics and Statistics.
    20. Li, Peili & Jiao, Yuling & Lu, Xiliang & Kang, Lican, 2022. "A data-driven line search rule for support recovery in high-dimensional data analysis," Computational Statistics & Data Analysis, Elsevier, vol. 174(C).
    21. Osamu Komori & Shinto Eguchi & John B. Copas, 2015. "Generalized t-statistic for two-group classification," Biometrics, The International Biometric Society, vol. 71(2), pages 404-416, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:140:y:2009:i:3:d:10.1007_s10957-008-9458-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form .

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.