
A Note on the Optimal Convergence Rate of Descent Methods with Fixed Step Sizes for Smooth Strongly Convex Functions

Author

Listed:
  • André Uschmajew

    (Max Planck Institute for Mathematics in the Sciences)

  • Bart Vandereycken

    (University of Geneva)

Abstract

Based on a result by Taylor et al. (J Optim Theory Appl 178(2):455–476, 2018) on the attainable convergence rate of gradient descent for smooth and strongly convex functions in terms of function values, an elementary convergence analysis for general descent methods with fixed step sizes is presented. It covers general variable metric methods, gradient-related search directions under angle and scaling conditions, as well as inexact gradient methods. In all cases, optimal rates are obtained.
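To make the setting concrete, here is a minimal numerical sketch (not taken from the paper) of gradient descent with the fixed step size 2/(L + μ) on an L-smooth, μ-strongly convex quadratic. The classical worst-case contraction factor in function values is ((L − μ)/(L + μ))², which the assertion checks at every iteration; the matrix, step size, and starting point are illustrative choices.

```python
import numpy as np

L, mu = 10.0, 1.0
A = np.diag([mu, L])           # f(x) = 0.5 * x^T A x, minimizer x* = 0
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x

step = 2.0 / (L + mu)          # fixed step size
rate = ((L - mu) / (L + mu)) ** 2   # worst-case per-step contraction

x = np.array([1.0, 1.0])
for _ in range(20):
    fx = f(x)
    x = x - step * grad(x)
    # function values contract by the factor `rate` (up to rounding)
    assert f(x) <= rate * fx + 1e-12
print(f(x))
```

On this diagonal quadratic the contraction is attained exactly: each coordinate shrinks in magnitude by (L − μ)/(L + μ), so the bound is tight, matching the optimality statement in the abstract.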

Suggested Citation

  • André Uschmajew & Bart Vandereycken, 2022. "A Note on the Optimal Convergence Rate of Descent Methods with Fixed Step Sizes for Smooth Strongly Convex Functions," Journal of Optimization Theory and Applications, Springer, vol. 194(1), pages 364-373, July.
  • Handle: RePEc:spr:joptap:v:194:y:2022:i:1:d:10.1007_s10957-022-02032-z
    DOI: 10.1007/s10957-022-02032-z

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-022-02032-z
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-022-02032-z?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Henry Wolkowicz, 1994. "Measures for Symmetric Rank-One Updates," Mathematics of Operations Research, INFORMS, vol. 19(4), pages 815-830, November.
    2. De Klerk, Etienne & Glineur, François & Taylor, Adrien B., 2020. "Worst-Case Convergence Analysis of Inexact Gradient and Newton Methods Through Semidefinite Programming Performance Estimation," LIDAM Reprints CORE 3134, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. De Klerk, Etienne & Glineur, François & Taylor, Adrien B., 2016. "On the Worst-case Complexity of the Gradient Method with Exact Line Search for Smooth Strongly Convex Functions," LIDAM Discussion Papers CORE 2016027, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    4. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization," Journal of Optimization Theory and Applications, Springer, vol. 178(2), pages 455-476, August.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sandra S. Y. Tan & Antonios Varvitsiotis & Vincent Y. F. Tan, 2021. "Analysis of Optimization Algorithms via Sum-of-Squares," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 56-81, July.
    2. Roland Hildebrand, 2021. "Optimal step length for the Newton method: case of self-concordant functions," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 94(2), pages 253-279, October.
    3. Abbaszadehpeivasti, Hadi & de Klerk, Etienne & Zamani, Moslem, 2022. "The exact worst-case convergence rate of the gradient method with fixed step lengths for L-smooth functions," Other publications TiSEM 061688c6-f97c-4024-bb5b-1, Tilburg University, School of Economics and Management.
    4. Donghwan Kim & Jeffrey A. Fessler, 2021. "Optimizing the Efficiency of First-Order Methods for Decreasing the Gradient of Smooth Convex Functions," Journal of Optimization Theory and Applications, Springer, vol. 188(1), pages 192-219, January.
    5. Abbaszadehpeivasti, Hadi & de Klerk, Etienne & Zamani, Moslem, 2023. "Convergence rate analysis of randomized and cyclic coordinate descent for convex optimization through semidefinite programming," Other publications TiSEM 88512ac0-c26a-4a99-b840-3, Tilburg University, School of Economics and Management.
    6. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization," Journal of Optimization Theory and Applications, Springer, vol. 178(2), pages 455-476, August.
    7. Wei Peng & Hui Zhang & Xiaoya Zhang & Lizhi Cheng, 2020. "Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions," Journal of Global Optimization, Springer, vol. 78(1), pages 69-89, September.
    8. Johannes Brust & Jennifer B. Erway & Roummel F. Marcia, 2017. "On solving L-SR1 trust-region subproblems," Computational Optimization and Applications, Springer, vol. 66(2), pages 245-266, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:194:y:2022:i:1:d:10.1007_s10957-022-02032-z. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.