
Adaptive cyclic gradient methods with interpolation

Authors

Listed:
  • Yixin Xie

(Beijing University of Posts and Telecommunications (BUPT); Key Laboratory of Mathematics and Information Networks (BUPT), Ministry of Education)

  • Cong Sun

(Beijing University of Posts and Telecommunications (BUPT); Key Laboratory of Mathematics and Information Networks (BUPT), Ministry of Education)

  • Ya-Xiang Yuan

(Institute of Computational Mathematics and Scientific/Engineering Computing, Academy of Mathematics and Systems Science, Chinese Academy of Sciences)

Abstract

The gradient method is an important tool for solving large-scale problems. In this paper, a new gradient method framework for unconstrained optimization is proposed, in which the stepsize is updated in a cyclic way: the Cauchy step is approximated by quadratic interpolation, and the cycle length for the stepsize update is adjusted adaptively. Combined with an adaptive nonmonotone line search technique, the method is proved to be globally convergent. Furthermore, its sublinear convergence rate for convex problems and its R-linear convergence rate for problems with the quadratic functional growth property are analyzed. Numerical results show that the proposed algorithm performs well in terms of both computational cost and attained function values.
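To make the abstract's ingredients concrete, here is a minimal Python sketch of the general idea only, not the authors' algorithm: the adaptive cycle adjustment and the nonmonotone line search are omitted, the cycle length is fixed, and all names (cyclic_interp_gradient, cycle_len, alpha0) are hypothetical. Once per cycle, it approximates the Cauchy step along the negative gradient by one-dimensional quadratic interpolation, then reuses that stepsize for the remaining iterations of the cycle.

import numpy as np

def cyclic_interp_gradient(f, grad, x0, cycle_len=4, alpha0=1.0,
                           tol=1e-6, max_iter=10_000):
    # Gradient descent whose stepsize is refreshed once per cycle by
    # approximating the Cauchy step argmin_a f(x - a*g) with a quadratic fit.
    x, alpha = np.asarray(x0, dtype=float), alpha0
    for k in range(max_iter):
        g = grad(x)
        gg = float(g @ g)          # ||g||^2 = -phi'(0) for phi(a) = f(x - a*g)
        if gg**0.5 <= tol:
            break
        if k % cycle_len == 0:
            # Fit q(a) = f(x) - a*||g||^2 + (c/2)*a^2 through the trial value
            # f(x - alpha*g); q matches phi in value and slope at a = 0.
            trial = f(x - alpha * g)
            c = 2.0 * (trial - f(x) + alpha * gg) / alpha**2
            if c > 0.0:
                alpha = gg / c     # minimizer of q: approximate Cauchy step
            # if c <= 0 the fit is nonconvex; keep the previous stepsize
        x = x - alpha * g
    return x

# Example: minimize the convex quadratic f(x) = 0.5 * x'Ax.
A = np.diag([1.0, 10.0, 100.0])
x_min = cyclic_interp_gradient(lambda x: 0.5 * x @ A @ x,
                               lambda x: A @ x,
                               x0=np.ones(3))

For a convex quadratic the fitted curvature c equals g'Ag exactly, so the interpolated step coincides with the exact Cauchy step g'g / g'Ag; for general smooth f it is a Hessian-free approximation costing one extra function evaluation per cycle.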

Suggested Citation

  • Yixin Xie & Cong Sun & Ya-Xiang Yuan, 2025. "Adaptive cyclic gradient methods with interpolation," Computational Optimization and Applications, Springer, vol. 92(1), pages 301-325, September.
  • Handle: RePEc:spr:coopap:v:92:y:2025:i:1:d:10.1007_s10589-025-00691-y
    DOI: 10.1007/s10589-025-00691-y

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-025-00691-y
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-025-00691-y?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Nicholas Gould & Dominique Orban & Philippe Toint, 2015. "CUTEst: a Constrained and Unconstrained Testing Environment with safe threads for mathematical optimization," Computational Optimization and Applications, Springer, vol. 60(3), pages 545-557, April.
    2. Ion Necoara & Yurii Nesterov & François Glineur, 2019. "Linear convergence of first order methods for non-strongly convex optimization," LIDAM Reprints CORE 3000, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    3. Roberta De Asmundis & Daniela di Serafino & William Hager & Gerardo Toraldo & Hongchao Zhang, 2014. "An efficient gradient method using the Yuan steplength," Computational Optimization and Applications, Springer, vol. 59(3), pages 541-563, December.
    4. Roberto Andreani & Marcos Raydan, 2021. "Properties of the delayed weighted gradient method," Computational Optimization and Applications, Springer, vol. 78(1), pages 167-180, January.
    5. Daniela di Serafino & Gerardo Toraldo & Marco Viola, 2021. "Using gradient directions to get global convergence of Newton-type methods," Applied Mathematics and Computation, Elsevier, vol. 409(C).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Hugo Lara & Rafael Aleixo & Harry Oviedo, 2024. "Delayed Weighted Gradient Method with simultaneous step-sizes for strongly convex optimization," Computational Optimization and Applications, Springer, vol. 89(1), pages 151-182, September.
    2. Silvia Berra & Alessandro Torraca & Federico Benvenuto & Sara Sommariva, 2024. "Combined Newton-Gradient Method for Constrained Root-Finding in Chemical Reaction Networks," Journal of Optimization Theory and Applications, Springer, vol. 200(1), pages 404-427, January.
    3. Moslem Zamani & Hadi Abbaszadehpeivasti & Etienne de Klerk, 2024. "The exact worst-case convergence rate of the alternating direction method of multipliers," Other publications TiSEM f30ae9e6-ed19-423f-bd1e-0, Tilburg University, School of Economics and Management.
    4. Roberto Andreani & Marcos Raydan, 2021. "Properties of the delayed weighted gradient method," Computational Optimization and Applications, Springer, vol. 78(1), pages 167-180, January.
    5. David J. Eckman & Shane G. Henderson & Sara Shashaani, 2023. "Diagnostic Tools for Evaluating and Comparing Simulation-Optimization Algorithms," INFORMS Journal on Computing, INFORMS, vol. 35(2), pages 350-367, March.
    6. Brian Irwin & Eldad Haber, 2023. "Secant penalized BFGS: a noise robust quasi-Newton method via penalizing the secant condition," Computational Optimization and Applications, Springer, vol. 84(3), pages 651-702, April.
    7. Matteo Lapucci & Alessio Sortino, 2024. "On the Convergence of Inexact Alternate Minimization in Problems with ℓ0 Penalties," SN Operations Research Forum, Springer, vol. 5(2), pages 1-11, June.
    8. S. Gratton & Ph. L. Toint, 2020. "A note on solving nonlinear optimization problems in variable precision," Computational Optimization and Applications, Springer, vol. 76(3), pages 917-933, July.
    9. Yassine Nabou & Ion Necoara, 2024. "Efficiency of higher-order algorithms for minimizing composite functions," Computational Optimization and Applications, Springer, vol. 87(2), pages 441-473, March.
    10. Felipe Lara & Raúl T. Marcavillaca & Phan Tu Vuong, 2025. "Characterizations, Dynamical Systems and Gradient Methods for Strongly Quasiconvex Functions," Journal of Optimization Theory and Applications, Springer, vol. 206(3), pages 1-25, September.
    11. Lahcen El Bourkhissi & Ion Necoara, 2025. "Complexity of linearized quadratic penalty for optimization with nonlinear equality constraints," Journal of Global Optimization, Springer, vol. 91(3), pages 483-510, March.
    12. Yutao Zheng & Bing Zheng, 2017. "Two New Dai–Liao-Type Conjugate Gradient Methods for Unconstrained Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 175(2), pages 502-509, November.
    13. Giovanni Fasano & Massimo Roma, 2016. "A novel class of approximate inverse preconditioners for large positive definite linear systems in optimization," Computational Optimization and Applications, Springer, vol. 65(2), pages 399-429, November.
    14. Vassilis Apidopoulos & Nicolò Ginatta & Silvia Villa, 2022. "Convergence rates for the heavy-ball continuous dynamics for non-convex optimization, under Polyak–Łojasiewicz condition," Journal of Global Optimization, Springer, vol. 84(3), pages 563-589, November.
    15. Yonggang Pei & Shaofang Song & Detong Zhu, 2023. "A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization," Computational Optimization and Applications, Springer, vol. 84(3), pages 1005-1033, April.
    16. Juan José Maulén & Juan Peypouquet, 2023. "A Speed Restart Scheme for a Dynamics with Hessian-Driven Damping," Journal of Optimization Theory and Applications, Springer, vol. 199(2), pages 831-855, November.
    17. Chhavi Sharma & Vishnu Narayanan & P. Balamurugan, 2024. "Distributed accelerated gradient methods with restart under quadratic growth condition," Journal of Global Optimization, Springer, vol. 90(1), pages 153-215, September.
    18. Ching-pei Lee & Stephen J. Wright, 2019. "Inexact Successive quadratic approximation for regularized optimization," Computational Optimization and Applications, Springer, vol. 72(3), pages 641-674, April.
    19. Behzad Azmi & Marco Bernreuther, 2025. "On the forward–backward method with nonmonotone linesearch for infinite-dimensional nonsmooth nonconvex problems," Computational Optimization and Applications, Springer, vol. 91(3), pages 1263-1308, July.
    20. Silvia Bonettini & Marco Prato & Simone Rebegoldi, 2016. "A cyclic block coordinate descent method with generalized gradient projections," Applied Mathematics and Computation, Elsevier, vol. 286(C), pages 288-300.

