
Generalized symmetric ADMM for separable convex optimization

Author

Listed:
  • Jianchao Bai

    (Xi’an Jiaotong University)

  • Jicheng Li

    (Xi’an Jiaotong University)

  • Fengmin Xu

    (Xi’an Jiaotong University)

  • Hongchao Zhang

    (Louisiana State University)

Abstract

The alternating direction method of multipliers (ADMM) has proved effective for solving separable convex optimization problems subject to linear constraints. In this paper, we propose a generalized symmetric ADMM (GS-ADMM), which updates the Lagrange multiplier twice with suitable stepsizes, to solve multi-block separable convex programming problems. GS-ADMM partitions the variables into two groups, one consisting of $$p$$ block variables and the other of $$q$$ block variables, where $$p \ge 1$$ and $$q \ge 1$$ are integers. The two groups are updated in a Gauss–Seidel scheme, while the variables within each group are updated in a Jacobi scheme, which makes the method attractive for big-data settings. By adding proper proximal terms to the subproblems, we specify a domain of the stepsizes that guarantees GS-ADMM is globally convergent with a worst-case $${\mathcal {O}}(1/t)$$ ergodic convergence rate. This stepsize domain turns out to be significantly larger than the convergence domains established in the literature, so GS-ADMM is more flexible, allowing larger stepsizes for the dual variable. In addition, two special cases of GS-ADMM, which allow the proximal terms to be zero, are discussed and analyzed. Preliminary numerical experiments on a sparse matrix minimization problem from statistical learning show that, compared with several state-of-the-art methods, the proposed method is effective and promising.
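The update pattern described in the abstract can be illustrated with a short sketch. The following Python snippet is not the authors' implementation: it assumes simple quadratic blocks $$f_i(x_i) = \tfrac{1}{2}\Vert x_i - c_i\Vert ^2$$, whose proximal subproblems have closed-form solutions, and uses illustrative values for the penalty $$\beta$$, the two dual stepsizes $$\tau , \sigma$$, and the proximal weight $$\rho$$; the admissible stepsize domain derived in the paper is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance: min sum_i 0.5*||x_i - c_i||^2  s.t.  sum_i A_i x_i = b.
# The first p blocks form group one, the remaining q blocks group two.
p, q, n, m = 2, 2, 3, 4                      # block counts, block size, #constraints
N = p + q
A = [rng.standard_normal((m, n)) for _ in range(N)]
c = [rng.standard_normal(n) for _ in range(N)]
b = rng.standard_normal(m)

beta = 1.0                                   # augmented-Lagrangian penalty
tau, sigma = 0.5, 0.5                        # the two dual stepsizes (illustrative)
rho = 2.0                                    # proximal weight (illustrative)

x = [np.zeros(n) for _ in range(N)]
lam = np.zeros(m)                            # Lagrange multiplier

def residual(x):
    return sum(A[i] @ x[i] for i in range(N)) - b

def prox_step(i, x, lam):
    """Closed-form minimizer of block i's proximal subproblem, with the
    other blocks frozen at their current values (Jacobi within a group)."""
    r = residual(x) - A[i] @ x[i]            # sum_{j != i} A_j x_j - b
    H = (1.0 + rho) * np.eye(n) + beta * A[i].T @ A[i]
    g = c[i] + A[i].T @ lam - beta * A[i].T @ r + rho * x[i]
    return np.linalg.solve(H, g)

for k in range(500):
    # Group one: Jacobi (parallelizable) proximal updates of blocks 0..p-1.
    x[:p] = [prox_step(i, x, lam) for i in range(p)]
    # First dual update with stepsize tau (the "symmetric" extra update).
    lam = lam - tau * beta * residual(x)
    # Group two: Jacobi proximal updates of blocks p..N-1; Gauss-Seidel
    # relative to group one, since they see the new group-one iterates.
    x[p:] = [prox_step(i, x, lam) for i in range(p, N)]
    # Second dual update with stepsize sigma.
    lam = lam - sigma * beta * residual(x)

print("constraint violation:", np.linalg.norm(residual(x)))
```

The two dual updates per iteration are what make the scheme "symmetric," and because each group's blocks are updated from the same iterate, the within-group subproblems can be solved in parallel.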

Suggested Citation

  • Jianchao Bai & Jicheng Li & Fengmin Xu & Hongchao Zhang, 2018. "Generalized symmetric ADMM for separable convex optimization," Computational Optimization and Applications, Springer, vol. 70(1), pages 129-170, May.
  • Handle: RePEc:spr:coopap:v:70:y:2018:i:1:d:10.1007_s10589-017-9971-0
    DOI: 10.1007/s10589-017-9971-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-017-9971-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-017-9971-0?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Maryam Yashtini, 2022. "Convergence and rate analysis of a proximal linearized ADMM for nonconvex nonsmooth optimization," Journal of Global Optimization, Springer, vol. 84(4), pages 913-939, December.
    2. Shengjie Xu & Bingsheng He, 2021. "A parallel splitting ALM-based algorithm for separable convex programming," Computational Optimization and Applications, Springer, vol. 80(3), pages 831-851, December.
    3. Jianchao Bai & William W. Hager & Hongchao Zhang, 2022. "An inexact accelerated stochastic ADMM for separable convex optimization," Computational Optimization and Applications, Springer, vol. 81(2), pages 479-518, March.
    4. Peixuan Li & Yuan Shen & Suhong Jiang & Zehua Liu & Caihua Chen, 2021. "Convergence study on strictly contractive Peaceman–Rachford splitting method for nonseparable convex minimization models with quadratic coupling terms," Computational Optimization and Applications, Springer, vol. 78(1), pages 87-124, January.
    5. Yaning Jiang & Deren Han & Xingju Cai, 2022. "An efficient partial parallel method with scaling step size strategy for three-block convex optimization problems," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 96(3), pages 383-419, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:70:y:2018:i:1:d:10.1007_s10589-017-9971-0. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

We have no bibliographic references for this item. You can help add them by using this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.