
Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient

Authors

Listed:

  • Fedor Stonyakin (Moscow Institute of Physics and Technology; V. I. Vernadsky Crimean Federal University)
  • Ilya Kuruzov (Moscow Institute of Physics and Technology; Institute for Information Transmission Problems)
  • Boris Polyak (Moscow Institute of Physics and Technology; Institute for Control Sciences)

Abstract

We study the gradient method for problems that are, in general, non-convex, under the assumption that only an additively inexact gradient is available. The combination of non-convexity and gradient inexactness can cause difficulties: the trajectory of the method may travel far from the starting point, and in the presence of noise this unbounded drift can carry the trajectory away from the desired global solution. We analyze the behavior of the trajectory under a bound on the gradient error together with a gradient-dominance (Polyak–Łojasiewicz) condition, which is known to hold for many important non-convex problems and leads to good complexity guarantees for the gradient method. We propose an early-stopping rule that, first, guarantees an acceptable objective value at the output point and, second, keeps that point at a moderate distance from the chosen starting point. Besides the gradient method with a constant step size, we also study in detail a variant with an adaptive step size, which makes the technique applicable when the Lipschitz constant of the gradient is unknown. Computational experiments demonstrate the effectiveness of the proposed stopping rule for the methods studied.
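For intuition, the following is a minimal Python sketch of the kind of scheme described above: gradient descent driven by an additively inexact gradient, stopped early once the observed gradient norm drops to the noise level, together with a backtracking variant for the case of an unknown Lipschitz constant. The threshold multiplier c, the relaxed sufficient-decrease test, and all default values are illustrative assumptions here, not the constants or the exact adaptive procedure from the paper.

```python
import numpy as np

def gd_early_stop(grad_noisy, x0, L, delta, c=2.0, max_iter=10_000):
    """Constant-step gradient method with an additively inexact gradient.

    grad_noisy(x) returns g with ||g - grad f(x)|| <= delta.
    Stops early once ||g|| <= c * delta: below that level the noise can
    dominate the true gradient, so further steps may only drift.
    (The multiplier c is an illustrative choice, not the paper's constant.)
    """
    x = np.asarray(x0, dtype=float).copy()
    for k in range(max_iter):
        g = grad_noisy(x)
        if np.linalg.norm(g) <= c * delta:   # early-stopping rule
            return x, k
        x -= g / L                           # constant step size 1/L
    return x, max_iter

def gd_early_stop_adaptive(f, grad_noisy, x0, delta, L0=1.0, c=2.0,
                           max_iter=10_000):
    """Adaptive-step variant for an unknown Lipschitz constant L.

    The local estimate L_k is doubled until a sufficient-decrease test,
    relaxed by a delta-dependent slack, holds; it is then halved for the
    next iteration. A generic backtracking sketch, not the paper's
    exact adaptive rule.
    """
    x, Lk = np.asarray(x0, dtype=float).copy(), L0
    for k in range(max_iter):
        g = grad_noisy(x)
        gn = np.linalg.norm(g)
        if gn <= c * delta:                  # same early-stopping rule
            return x, k
        while True:
            x_new = x - g / Lk
            # Under L-smoothness and ||gradient error|| <= delta,
            # f(x+) <= f(x) - ||g||^2/(2 L_k) + delta*||g||/L_k holds
            # once L_k >= L, so this loop terminates.
            if f(x_new) <= f(x) - gn**2 / (2 * Lk) + delta * gn / Lk:
                break
            Lk *= 2.0
        x, Lk = x_new, Lk / 2.0
    return x, max_iter
```

Under gradient dominance with parameter μ, i.e., ||∇f(x)||² ≥ 2μ(f(x) − f*), stopping at ||g_k|| ≤ cδ gives ||∇f(x_k)|| ≤ (c + 1)δ by the triangle inequality, and hence f(x_k) − f* ≤ (c + 1)²δ²/(2μ); this is the flavor of function-value guarantee the abstract refers to.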

Suggested Citation

  • Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.
  • Handle: RePEc:spr:joptap:v:198:y:2023:i:2:d:10.1007_s10957-023-02245-w
    DOI: 10.1007/s10957-023-02245-w

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-023-02245-w
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-023-02245-w?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Olivier Devolder & François Glineur & Yurii Nesterov, 2011. "First-order methods of smooth convex optimization with inexact oracle," LIDAM Discussion Papers CORE 2011002, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    2. Yurii Nesterov & B.T. Polyak, 2006. "Cubic regularization of Newton method and its global performance," LIDAM Reprints CORE 1927, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Yurii Nesterov, 2015. "Universal gradient methods for convex optimization problems," LIDAM Reprints CORE 2701, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    2. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    3. Masoud Ahookhosh & Arnold Neumaier, 2018. "Solving structured nonsmooth convex optimization with complexity $\mathcal{O}(\varepsilon^{-1/2})$," TOP: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 26(1), pages 110-145, April.
    4. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    5. Anton Rodomanov & Yurii Nesterov, 2020. "Smoothness Parameter of Power of Euclidean Norm," Journal of Optimization Theory and Applications, Springer, vol. 185(2), pages 303-326, May.
    6. Silvia Berra & Alessandro Torraca & Federico Benvenuto & Sara Sommariva, 2024. "Combined Newton-Gradient Method for Constrained Root-Finding in Chemical Reaction Networks," Journal of Optimization Theory and Applications, Springer, vol. 200(1), pages 404-427, January.
    7. Ariizumi, Shumpei & Yamakawa, Yuya & Yamashita, Nobuo, 2024. "Convergence properties of Levenberg–Marquardt methods with generalized regularization terms," Applied Mathematics and Computation, Elsevier, vol. 463(C).
    8. Seonho Park & Seung Hyun Jung & Panos M. Pardalos, 2020. "Combining Stochastic Adaptive Cubic Regularization with Negative Curvature for Nonconvex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 184(3), pages 953-971, March.
    9. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    10. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    11. Xuexue Zhang & Sanyang Liu & Nannan Zhao, 2023. "An Extended Gradient Method for Smooth and Strongly Convex Functions," Mathematics, MDPI, vol. 11(23), pages 1-14, November.
    12. Kenji Ueda & Nobuo Yamashita, 2012. "Global Complexity Bound Analysis of the Levenberg–Marquardt Method for Nonsmooth Equations and Its Application to the Nonlinear Complementarity Problem," Journal of Optimization Theory and Applications, Springer, vol. 152(2), pages 450-467, February.
    13. DEVOLDER, Olivier & GLINEUR, François & NESTEROV, Yurii, 2013. "First-order methods with inexact oracle: the strongly convex case," LIDAM Discussion Papers CORE 2013016, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    14. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    15. Le Thi Khanh Hien & Cuong V. Nguyen & Huan Xu & Canyi Lu & Jiashi Feng, 2019. "Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 541-566, May.
    16. DEVOLDER, Olivier, 2011. "Stochastic first order methods in smooth convex optimization," LIDAM Discussion Papers CORE 2011070, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    17. Ya-Feng Liu & Xin Liu & Shiqian Ma, 2019. "On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming," Mathematics of Operations Research, INFORMS, vol. 44(2), pages 632-650, May.
    18. Liaoyuan Zeng & Ting Kei Pong, 2022. "$\rho$-regularization subproblems: strong duality and an eigensolver-based algorithm," Computational Optimization and Applications, Springer, vol. 81(2), pages 337-368, March.
    19. Yuquan Chen & Yunkang Sun & Bing Wang, 2023. "Improving the Performance of Optimization Algorithms Using the Adaptive Fixed-Time Scheme and Reset Scheme," Mathematics, MDPI, vol. 11(22), pages 1-16, November.
    20. Yurii Nesterov, 2021. "Superfast Second-Order Methods for Unconstrained Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 191(1), pages 1-30, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:198:y:2023:i:2:d:10.1007_s10957-023-02245-w. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.