
A Universal Accelerated Primal–Dual Method for Convex Optimization Problems

Author

Listed:
  • Hao Luo

    (Chongqing Normal University
    Peking University)

Abstract

This work presents a universal accelerated primal–dual method for affinely constrained convex optimization problems. It handles both Lipschitz and Hölder continuous gradients and does not need to know the smoothness level of the objective function. In the line search part, it uses dynamically decreasing parameters and produces an approximate Lipschitz constant of moderate magnitude. In addition, based on a suitable discrete Lyapunov function and tight decay estimates for some differential/difference inequalities, a universal optimal mixed-type convergence rate is established. Numerical tests are provided to confirm the efficiency of the proposed method.
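For context on the smoothness setting: a gradient is Hölder continuous of level ν ∈ (0, 1] if ‖∇f(x) − ∇f(y)‖ ≤ M_ν ‖x − y‖^ν for all x, y, and ν = 1 recovers the usual Lipschitz condition. A universal method must work across all such levels without being told ν. As a minimal sketch of the backtracking idea behind universal line searches (in the style of Nesterov's universal gradient methods, reference 4 below, and not the primal–dual algorithm of this paper), the Python snippet below doubles a trial constant until an inexact quadratic upper bound holds; the function names, the doubling rule, and the tolerance split are assumptions for illustration only.

```python
import numpy as np

def universal_backtracking_step(f, grad_f, x, L_trial, delta, max_doublings=50):
    """One gradient step with a universal backtracking line search.

    Sketch in the spirit of Nesterov's universal gradient methods:
    it finds an approximate Lipschitz-type constant L by doubling a
    trial value until an inexact quadratic upper bound holds, without
    knowing the Holder smoothness level of f. All names and the
    delta/2 tolerance split are illustrative assumptions, not the
    paper's primal-dual algorithm.
    """
    g = grad_f(x)
    L = L_trial
    for _ in range(max_doublings):
        x_new = x - g / L                      # trial gradient step
        d = x_new - x
        # Accept L once the quadratic model upper-bounds f up to delta/2.
        if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d) + 0.5 * delta:
            return x_new, L
        L *= 2.0                               # model failed: double the constant
    return x_new, L                            # fallback after max_doublings

# Tiny usage example on a smooth quadratic, where the search recovers a
# constant within a factor of two of the true Lipschitz constant (here 1).
f = lambda x: 0.5 * float(x @ x)
grad_f = lambda x: x.copy()
x1, L = universal_backtracking_step(f, grad_f, np.array([1.0, -2.0]),
                                    L_trial=0.1, delta=1e-8)
```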

Suggested Citation

  • Hao Luo, 2024. "A Universal Accelerated Primal–Dual Method for Convex Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 201(1), pages 280-312, April.
  • Handle: RePEc:spr:joptap:v:201:y:2024:i:1:d:10.1007_s10957-024-02394-6
    DOI: 10.1007/s10957-024-02394-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-024-02394-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-024-02394-6?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Yurii Nesterov, 2013. "Gradient methods for minimizing composite functions," LIDAM Reprints CORE 2510, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Myeongmin Kang & Myungjoo Kang & Miyoun Jung, 2015. "Inexact accelerated augmented Lagrangian methods," Computational Optimization and Applications, Springer, vol. 62(2), pages 373-404, November.
    3. Olivier Devolder & François Glineur & Yurii Nesterov, 2011. "First-order methods of smooth convex optimization with inexact oracle," LIDAM Discussion Papers CORE 2011002, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    4. Yurii Nesterov, 2015. "Universal gradient methods for convex optimization problems," LIDAM Reprints CORE 2701, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    5. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    2. Alkousa, Mohammad & Stonyakin, Fedor & Gasnikov, Alexander & Abdo, Asmaa & Alcheikh, Mohammad, 2024. "Higher degree inexact model for optimization problems," Chaos, Solitons & Fractals, Elsevier, vol. 186(C).
    3. Masoud Ahookhosh & Arnold Neumaier, 2018. "Solving structured nonsmooth convex optimization with complexity $$\mathcal{O}(\varepsilon^{-1/2})$$," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 26(1), pages 110-145, April.
    4. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    5. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    6. Le Thi Khanh Hien & Cuong V. Nguyen & Huan Xu & Canyi Lu & Jiashi Feng, 2019. "Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 541-566, May.
    7. Ya-Feng Liu & Xin Liu & Shiqian Ma, 2019. "On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming," Mathematics of Operations Research, INFORMS, vol. 44(2), pages 632-650, May.
    8. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    9. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
    10. Huynh Ngai & Ta Anh Son, 2022. "Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k^2)," Computational Optimization and Applications, Springer, vol. 83(2), pages 615-649, November.
    11. Hedy Attouch & Zaki Chbani & Jalal Fadili & Hassan Riahi, 2022. "Fast Convergence of Dynamical ADMM via Time Scaling of Damped Inertial Dynamics," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 704-736, June.
    12. Masaru Ito & Mituhiro Fukuda, 2021. "Nearly Optimal First-Order Methods for Convex Optimization under Gradient Norm Measure: an Adaptive Regularization Approach," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 770-804, March.
    13. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization," Journal of Optimization Theory and Applications, Springer, vol. 178(2), pages 455-476, August.
    14. Chin Pang Ho & Panos Parpas, 2019. "Empirical risk minimization: probabilistic complexity and stepsize strategy," Computational Optimization and Applications, Springer, vol. 73(2), pages 387-410, June.
    15. Klimza, Anton & Gasnikov, Alexander & Stonyakin, Fedor & Alkousa, Mohammad, 2024. "Universal methods for variational inequalities: Deterministic and stochastic cases," Chaos, Solitons & Fractals, Elsevier, vol. 187(C).
    16. Eduard Gorbunov & Marina Danilova & Innokentiy Shibaev & Pavel Dvurechensky & Alexander Gasnikov, 2024. "High-Probability Complexity Bounds for Non-smooth Stochastic Convex Optimization with Heavy-Tailed Noise," Journal of Optimization Theory and Applications, Springer, vol. 203(3), pages 2679-2738, December.
    17. Pavel Dvurechensky & Alexander Gasnikov, 2016. "Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 171(1), pages 121-145, October.
    18. Stefania Bellavia & Gianmarco Gurioli & Benedetta Morini & Philippe Louis Toint, 2023. "The Impact of Noise on Evaluation Complexity: The Deterministic Trust-Region Case," Journal of Optimization Theory and Applications, Springer, vol. 196(2), pages 700-729, February.
    19. Guillaume O. Berger & P.-A. Absil & Raphaël M. Jungers & Yurii Nesterov, 2020. "On the Quality of First-Order Approximation of Functions with Hölder Continuous Gradient," Journal of Optimization Theory and Applications, Springer, vol. 185(1), pages 17-33, April.
    20. Filip Hanzely & Peter Richtárik & Lin Xiao, 2021. "Accelerated Bregman proximal gradient methods for relatively smooth convex optimization," Computational Optimization and Applications, Springer, vol. 79(2), pages 405-440, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:201:y:2024:i:1:d:10.1007_s10957-024-02394-6. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing. General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.