An Adaptive Alternating Direction Method of Multipliers

Author

Listed:
  • Sedi Bartz

    (University of Massachusetts Lowell)

  • Rubén Campoy

    (Universitat de València)

  • Hung M. Phan

    (University of Massachusetts Lowell)

Abstract

The alternating direction method of multipliers (ADMM) is a powerful splitting algorithm for linearly constrained convex optimization problems. In view of its popularity and applicability, growing attention has been drawn to the ADMM in nonconvex settings. Recent studies of minimization problems with nonconvex objectives impose various combinations of assumptions on the objective function, in particular a Lipschitz gradient assumption. We consider the case where the objective is the sum of a strongly convex function and a weakly convex function. For this setting, we present and study an adaptive version of the ADMM which incorporates generalized notions of convexity and penalty parameters adapted to the convexity constants of the functions. We prove convergence of the scheme under natural assumptions. To this end, we employ the recent adaptive Douglas–Rachford algorithm, revisiting the well-known duality relation between the classical ADMM and the Douglas–Rachford splitting algorithm and generalizing this connection to our setting. We illustrate our approach by relating it to, and comparing it with, alternatives, and by numerical experiments on a signal denoising problem.
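
As context for the abstract, and not as a rendering of the paper's adaptive method, the sketch below shows the classical scaled-form ADMM on a toy l1 signal-denoising problem. The splitting f(x) = 0.5*||x - b||^2, g(z) = lam*||z||_1 with constraint x - z = 0, the fixed penalty rho, and the function names are illustrative assumptions, not taken from the paper.

    # A minimal sketch, assuming a toy l1 denoising problem:
    #     minimize_x  0.5 * ||x - b||^2 + lam * ||x||_1,
    # split as f(x) = 0.5 * ||x - b||^2 and g(z) = lam * ||z||_1 with x - z = 0.
    # This is the classical scaled-form ADMM, NOT the paper's adaptive ADMM:
    # here the penalty rho is a fixed, hand-chosen constant rather than being
    # adapted to the convexity constants of f and g.
    import numpy as np

    def soft_threshold(v, tau):
        # Proximal operator of tau * ||.||_1 (componentwise soft thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def admm_l1_denoise(b, lam=0.5, rho=1.0, num_iters=200):
        x = np.zeros_like(b)
        z = np.zeros_like(b)
        u = np.zeros_like(b)  # scaled dual variable for the constraint x - z = 0
        for _ in range(num_iters):
            # x-update: the prox of the quadratic f has a closed form.
            x = (b + rho * (z - u)) / (1.0 + rho)
            # z-update: the prox of g = lam * ||.||_1 is soft thresholding.
            z = soft_threshold(x + u, lam / rho)
            # dual ascent step on the scaled multiplier.
            u = u + x - z
        return z

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        clean = np.zeros(200)
        clean[[20, 60, 100, 150]] = [3.0, -2.0, 4.0, -3.5]  # sparse spikes
        noisy = clean + 0.3 * rng.standard_normal(clean.size)
        denoised = admm_l1_denoise(noisy, lam=0.5, rho=1.0)
        print("noisy MSE   :", np.mean((noisy - clean) ** 2))
        print("denoised MSE:", np.mean((denoised - clean) ** 2))

In the setting of the paper, f would be strongly convex and g weakly convex, and the single fixed rho above would be replaced by penalty parameters tied to the convexity constants of the two functions.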

Suggested Citation

  • Sedi Bartz & Rubén Campoy & Hung M. Phan, 2022. "An Adaptive Alternating Direction Method of Multipliers," Journal of Optimization Theory and Applications, Springer, vol. 195(3), pages 1019-1055, December.
  • Handle: RePEc:spr:joptap:v:195:y:2022:i:3:d:10.1007_s10957-022-02098-9
    DOI: 10.1007/s10957-022-02098-9

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-022-02098-9
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-022-02098-9?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Liang Chen & Defeng Sun & Kim-Chuan Toh, 2017. "A note on the convergence of ADMM for linearly constrained convex optimization problems," Computational Optimization and Applications, Springer, vol. 66(2), pages 327-343, March.
    2. Patrick L. Combettes & Jean-Christophe Pesquet, 2011. "Proximal Splitting Methods in Signal Processing," Springer Optimization and Its Applications, in: Heinz H. Bauschke & Regina S. Burachik & Patrick L. Combettes & Veit Elser & D. Russell Luke & Henry Wolkowicz (ed.), Fixed-Point Algorithms for Inverse Problems in Science and Engineering, chapter 0, pages 185-212, Springer.
    3. Ernest K. Ryu & Yanli Liu & Wotao Yin, 2019. "Douglas–Rachford splitting and ADMM for pathological convex optimization," Computational Optimization and Applications, Springer, vol. 74(3), pages 747-778, December.
    4. Sedi Bartz & Rubén Campoy & Hung M. Phan, 2020. "Demiclosedness Principles for Generalized Nonexpansive Mappings," Journal of Optimization Theory and Applications, Springer, vol. 186(3), pages 759-778, September.
    5. Sedi Bartz & Minh N. Dao & Hung M. Phan, 2022. "Conical averagedness and convergence analysis of fixed point algorithms," Journal of Global Optimization, Springer, vol. 82(2), pages 351-373, February.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ernest K. Ryu & Yanli Liu & Wotao Yin, 2019. "Douglas–Rachford splitting and ADMM for pathological convex optimization," Computational Optimization and Applications, Springer, vol. 74(3), pages 747-778, December.
    2. Guillaume Sagnol & Edouard Pauwels, 2019. "An unexpected connection between Bayes A-optimal designs and the group lasso," Statistical Papers, Springer, vol. 60(2), pages 565-584, April.
    3. Junhong Lin & Lorenzo Rosasco & Silvia Villa & Ding-Xuan Zhou, 2018. "Modified Fejér sequences and applications," Computational Optimization and Applications, Springer, vol. 71(1), pages 95-113, September.
    4. Silvia Bonettini & Peter Ochs & Marco Prato & Simone Rebegoldi, 2023. "An abstract convergence framework with application to inertial inexact forward–backward methods," Computational Optimization and Applications, Springer, vol. 84(2), pages 319-362, March.
    5. Puya Latafat & Panagiotis Patrinos, 2017. "Asymmetric forward–backward–adjoint splitting for solving monotone inclusions involving three operators," Computational Optimization and Applications, Springer, vol. 68(1), pages 57-93, September.
    6. Hedy Attouch & Alexandre Cabot & Zaki Chbani & Hassan Riahi, 2018. "Inertial Forward–Backward Algorithms with Perturbations: Application to Tikhonov Regularization," Journal of Optimization Theory and Applications, Springer, vol. 179(1), pages 1-36, October.
    7. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    8. Suthep Suantai & Kunrada Kankam & Prasit Cholamjiak, 2020. "A Novel Forward-Backward Algorithm for Solving Convex Minimization Problem in Hilbert Spaces," Mathematics, MDPI, vol. 8(1), pages 1-13, January.
    9. Wang, Yugang & Huang, Ting-Zhu & Zhao, Xi-Le & Deng, Liang-Jian & Ji, Teng-Yu, 2020. "A convex single image dehazing model via sparse dark channel prior," Applied Mathematics and Computation, Elsevier, vol. 375(C).
    10. Julian Rasch & Antonin Chambolle, 2020. "Inexact first-order primal–dual algorithms," Computational Optimization and Applications, Springer, vol. 76(2), pages 381-430, June.
    11. Sun, Shilin & Wang, Tianyang & Yang, Hongxing & Chu, Fulei, 2022. "Damage identification of wind turbine blades using an adaptive method for compressive beamforming based on the generalized minimax-concave penalty function," Renewable Energy, Elsevier, vol. 181(C), pages 59-70.
    12. S. Bonettini & M. Prato & S. Rebegoldi, 2018. "A block coordinate variable metric linesearch based proximal gradient method," Computational Optimization and Applications, Springer, vol. 71(1), pages 5-52, September.
    13. David Degras, 2021. "Sparse group fused lasso for model segmentation: a hybrid approach," Advances in Data Analysis and Classification, Springer;German Classification Society - Gesellschaft für Klassifikation (GfKl);Japanese Classification Society (JCS);Classification and Data Analysis Group of the Italian Statistical Society (CLADAG);International Federation of Classification Societies (IFCS), vol. 15(3), pages 625-671, September.
    14. Yao, Yu & Zhu, Xiaoning & Dong, Hongyu & Wu, Shengnan & Wu, Hailong & Carol Tong, Lu & Zhou, Xuesong, 2019. "ADMM-based problem decomposition scheme for vehicle routing problem with time windows," Transportation Research Part B: Methodological, Elsevier, vol. 129(C), pages 156-174.
    15. Anda Tang & Pei Quan & Lingfeng Niu & Yong Shi, 2022. "A Survey for Sparse Regularization Based Compression Methods," Annals of Data Science, Springer, vol. 9(4), pages 695-722, August.
    16. Christian Grussler & Pontus Giselsson, 2022. "Efficient Proximal Mapping Computation for Low-Rank Inducing Norms," Journal of Optimization Theory and Applications, Springer, vol. 192(1), pages 168-194, January.
    17. Bonettini, Silvia & Prato, Marco & Rebegoldi, Simone, 2016. "A cyclic block coordinate descent method with generalized gradient projections," Applied Mathematics and Computation, Elsevier, vol. 286(C), pages 288-300.
    18. Nguyen Hieu Thao, 2018. "A convergent relaxation of the Douglas–Rachford algorithm," Computational Optimization and Applications, Springer, vol. 70(3), pages 841-863, July.
    19. Rodrigo Verschae & Takekazu Kato & Takashi Matsuyama, 2016. "Energy Management in Prosumer Communities: A Coordinated Approach," Energies, MDPI, vol. 9(7), pages 1-27, July.
    20. Jérôme Bolte & Edouard Pauwels, 2016. "Majorization-Minimization Procedures and Convergence of SQP Methods for Semi-Algebraic and Tame Programs," Mathematics of Operations Research, INFORMS, vol. 41(2), pages 442-465, May.
