
An Inexact Modified Subgradient Algorithm for Primal-Dual Problems via Augmented Lagrangians

Authors

Listed:
  • Regina S. Burachik

    (University of South Australia)

  • Alfredo N. Iusem

    (Instituto Nacional de Matemática Pura e Aplicada)

  • Jefferson G. Melo

    (Universidade Federal de Goiás)

Abstract

We consider a primal optimization problem in a reflexive Banach space and a duality scheme via generalized augmented Lagrangians. For solving the dual problem (in a Hilbert space), we introduce and analyze a new parameterized Inexact Modified Subgradient (IMSg) algorithm. The IMSg generates a primal-dual sequence, and we focus on two simple new choices of the stepsize. We prove that every weak accumulation point of the primal sequence is a primal solution and the dual sequence converges weakly to a dual solution, as long as the dual optimal set is nonempty. Moreover, we establish primal convergence even when the dual optimal set is empty. Our second choice of the stepsize gives rise to a variant of IMSg which has finite termination.
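To make the flavor of the dual scheme concrete, the following is a minimal numerical sketch of a modified-subgradient iteration of this general type. It is not the paper's method: it assumes a finite-dimensional, equality-constrained problem min f(x) s.t. h(x) = 0, uses one common instance of an augmented Lagrangian (the sharp Lagrangian L(x, u, c) = f(x) + c*||h(x)|| - <u, h(x)> appearing in related work), delegates the inexact inner minimization to scipy.optimize.minimize, and replaces the two stepsize rules analyzed in the article with a simple diminishing stepsize. The names imsg_sketch, f, h, and the tolerances are illustrative choices.

    import numpy as np
    from scipy.optimize import minimize

    def imsg_sketch(f, h, x0, c0=1.0, tol=1e-6, max_iter=50):
        """Illustrative inexact modified-subgradient iteration on the dual of
        min f(x) s.t. h(x) = 0, built from the sharp augmented Lagrangian
        L(x, u, c) = f(x) + c*||h(x)|| - <u, h(x)>.
        The inner minimization is only approximate (hence "inexact"), and the
        stepsize rule below is a simple diminishing choice for illustration."""
        x = np.asarray(x0, dtype=float)
        u = np.zeros_like(h(x))            # dual multiplier estimate
        c = c0                             # penalty parameter
        for k in range(max_iter):
            L = lambda z, u=u, c=c: f(z) + c * np.linalg.norm(h(z)) - u @ h(z)
            # inexact primal step: approximate minimizer of the augmented Lagrangian
            x = minimize(L, x, method="Nelder-Mead",
                         options={"xatol": 1e-5, "fatol": 1e-5}).x
            hx = h(x)
            viol = np.linalg.norm(hx)
            if viol <= tol:                # (near-)feasible primal point: stop
                break
            s = 1.0 / (k + 1)              # illustrative stepsize, not the paper's rules
            u = u - s * hx                 # multiplier update along -h(x_k)
            c = c + 1.5 * s * viol         # penalty update keeps c_k increasing
        return x, u, c

    # Toy usage: minimize x1^2 + x2^2 subject to x1 + x2 - 1 = 0 (solution (0.5, 0.5)).
    x, u, c = imsg_sketch(lambda z: z[0]**2 + z[1]**2,
                          lambda z: np.array([z[0] + z[1] - 1.0]),
                          x0=[0.0, 0.0])

In this sketch the penalty parameter grows whenever the constraint violation is nonzero, which mirrors the role the stepsize plays in driving both the multiplier and penalty updates; the article's analysis concerns the infinite-dimensional setting and specific stepsize choices that this toy example does not reproduce.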

Suggested Citation

  • Regina S. Burachik & Alfredo N. Iusem & Jefferson G. Melo, 2013. "An Inexact Modified Subgradient Algorithm for Primal-Dual Problems via Augmented Lagrangians," Journal of Optimization Theory and Applications, Springer, vol. 157(1), pages 108-131, April.
  • Handle: RePEc:spr:joptap:v:157:y:2013:i:1:d:10.1007_s10957-012-0158-7
    DOI: 10.1007/s10957-012-0158-7

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-012-0158-7
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-012-0158-7?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a source where you can use your library subscription to access this item
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Regina S. Burachik & C. Yalçın Kaya, 2010. "A Deflected Subgradient Method Using a General Augmented Lagrangian Duality with Implications on Penalty Methods," Springer Optimization and Its Applications, in: Regina S. Burachik & Jen-Chih Yao (ed.), Variational Analysis and Generalized Differentiation in Optimization and Control, pages 109-132, Springer.
    2. Y. Y. Zhou & X. Q. Yang, 2009. "Duality and Penalization in Optimization via an Augmented Lagrangian Function with Applications," Journal of Optimization Theory and Applications, Springer, vol. 140(1), pages 171-188, January.
    3. Regina Burachik & Alfredo Iusem & Jefferson Melo, 2010. "A primal dual modified subgradient algorithm with sharp Lagrangian," Journal of Global Optimization, Springer, vol. 46(3), pages 347-361, March.
    4. R. S. Burachik & A. N. Iusem & J. G. Melo, 2010. "Duality and Exact Penalization for General Augmented Lagrangians," Journal of Optimization Theory and Applications, Springer, vol. 147(1), pages 125-140, October.
    5. Regina S. Burachik & Alfredo N. Iusem, 2008. "Set-Valued Mappings and Enlargements of Monotone Operators," Springer Optimization and Its Applications, Springer, number 978-0-387-69757-4, September.
    6. Regina S. Burachik & Alfredo N. Iusem, 2008. "Enlargements of Monotone Operators," Springer Optimization and Its Applications, in: Set-Valued Mappings and Enlargements of Monotone Operators, chapter 0, pages 161-220, Springer.
    7. X. X. Huang & X. Q. Yang, 2003. "A Unified Augmented Lagrangian Approach to Duality and Exact Penalization," Mathematics of Operations Research, INFORMS, vol. 28(3), pages 533-552, August.
    8. A. Nedić & A. Ozdaglar, 2009. "Subgradient Methods for Saddle-Point Problems," Journal of Optimization Theory and Applications, Springer, vol. 142(1), pages 205-228, July.
    9. Regina Burachik & C. Kaya & Musa Mammadov, 2010. "An inexact modified subgradient algorithm for nonconvex optimization," Computational Optimization and Applications, Springer, vol. 45(1), pages 1-24, January.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Regina Burachik & Wilhelm Freire & C. Kaya, 2014. "Interior Epigraph Directions method for nonsmooth and nonconvex optimization via generalized augmented Lagrangian duality," Journal of Global Optimization, Springer, vol. 60(3), pages 501-529, November.
    2. Yu Zhou & Jin Zhou & Xiao Yang, 2014. "Existence of augmented Lagrange multipliers for cone constrained optimization problems," Journal of Global Optimization, Springer, vol. 58(2), pages 243-260, February.
    3. L. F. Bueno & G. Haeser & J. M. Martínez, 2015. "A Flexible Inexact-Restoration Method for Constrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 165(1), pages 188-208, April.
    4. Huynh Van Ngai & Nguyen Huu Tron & Michel Théra, 2014. "Metric Regularity of the Sum of Multifunctions and Applications," Journal of Optimization Theory and Applications, Springer, vol. 160(2), pages 355-390, February.
    5. Dawan Chumpungam & Panitarn Sarnmeta & Suthep Suantai, 2021. "A New Forward–Backward Algorithm with Line Search and Inertial Techniques for Convex Minimization Problems with Applications," Mathematics, MDPI, vol. 9(13), pages 1-20, July.
    6. Walaa M. Moursi & Lieven Vandenberghe, 2019. "Douglas–Rachford Splitting for the Sum of a Lipschitz Continuous and a Strongly Monotone Operator," Journal of Optimization Theory and Applications, Springer, vol. 183(1), pages 179-198, October.
    7. Sedi Bartz & Minh N. Dao & Hung M. Phan, 2022. "Conical averagedness and convergence analysis of fixed point algorithms," Journal of Global Optimization, Springer, vol. 82(2), pages 351-373, February.
    8. Bello Cruz, J.Y. & Iusem, A.N., 2015. "Full convergence of an approximate projection method for nonsmooth variational inequalities," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 114(C), pages 2-13.
    9. Regina S. Burachik & Minh N. Dao & Scott B. Lindstrom, 2021. "Generalized Bregman Envelopes and Proximity Operators," Journal of Optimization Theory and Applications, Springer, vol. 190(3), pages 744-778, September.
    10. Warunun Inthakon & Suthep Suantai & Panitarn Sarnmeta & Dawan Chumpungam, 2020. "A New Machine Learning Algorithm Based on Optimization Method for Regression and Classification Problems," Mathematics, MDPI, vol. 8(6), pages 1-17, June.
    11. Juan Pablo Luna & Claudia Sagastizábal & Mikhail Solodov, 2020. "A class of Benders decomposition methods for variational inequalities," Computational Optimization and Applications, Springer, vol. 76(3), pages 935-959, July.
    12. Heinz H. Bauschke & Warren L. Hare & Walaa M. Moursi, 2016. "On the Range of the Douglas–Rachford Operator," Mathematics of Operations Research, INFORMS, vol. 41(3), pages 884-897, August.
    13. Hsien-Chung Wu, 2018. "Near Fixed Point Theorems in Hyperspaces," Mathematics, MDPI, vol. 6(6), pages 1-15, May.
    14. Walaa M. Moursi, 2018. "The Forward–Backward Algorithm and the Normal Problem," Journal of Optimization Theory and Applications, Springer, vol. 176(3), pages 605-624, March.
    15. Dawan Chumpungam & Panitarn Sarnmeta & Suthep Suantai, 2022. "An Accelerated Convex Optimization Algorithm with Line Search and Applications in Machine Learning," Mathematics, MDPI, vol. 10(9), pages 1-20, April.
    16. Yunier Bello-Cruz & Guoyin Li & Tran T. A. Nghia, 2021. "On the Linear Convergence of Forward–Backward Splitting Method: Part I—Convergence Analysis," Journal of Optimization Theory and Applications, Springer, vol. 188(2), pages 378-401, February.
    17. A. J. Zaslavski, 2014. "An Approximate Exact Penalty in Constrained Vector Optimization on Metric Spaces," Journal of Optimization Theory and Applications, Springer, vol. 162(2), pages 649-664, August.
    18. R. S. Burachik & A. N. Iusem & J. G. Melo, 2010. "Duality and Exact Penalization for General Augmented Lagrangians," Journal of Optimization Theory and Applications, Springer, vol. 147(1), pages 125-140, October.
    19. L. C. Ceng & B. S. Mordukhovich & J. C. Yao, 2010. "Hybrid Approximate Proximal Method with Auxiliary Variational Inequality for Vector Optimization," Journal of Optimization Theory and Applications, Springer, vol. 146(2), pages 267-303, August.
    20. J. Bello Cruz & A. Iusem, 2010. "Convergence of direct methods for paramonotone variational inequalities," Computational Optimization and Applications, Springer, vol. 46(2), pages 247-263, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:157:y:2013:i:1:d:10.1007_s10957-012-0158-7. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.