
On a family of relaxed gradient descent methods for strictly convex quadratic minimization

Author

Listed:
  • Liam MacDonald

    (University of Canterbury)

  • Rua Murray

    (University of Canterbury)

  • Rachael Tappenden

    (University of Canterbury)

Abstract

This paper studies the convergence properties of a family of Relaxed $$\ell $$-Minimal Gradient Descent methods for quadratic optimization; the family includes the omnipresent Steepest Descent method, as well as the Minimal Gradient method. Simple proofs are provided that show, in an appropriately chosen norm, the gradient and the distance of the iterates from optimality converge linearly, for all members of the family. Moreover, the function values decrease linearly, and iteration complexity results are provided. All theoretical results hold when (fixed) relaxation is employed. It is also shown that, given a fixed overhead and storage budget, every Relaxed $$\ell $$-Minimal Gradient Descent method can be implemented using exactly one matrix-vector product. Numerical experiments are presented that illustrate the benefits of relaxation across the family.

Suggested Citation

  • Liam MacDonald & Rua Murray & Rachael Tappenden, 2025. "On a family of relaxed gradient descent methods for strictly convex quadratic minimization," Computational Optimization and Applications, Springer, vol. 91(1), pages 173-200, May.
  • Handle: RePEc:spr:coopap:v:91:y:2025:i:1:d:10.1007_s10589-025-00670-3
    DOI: 10.1007/s10589-025-00670-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-025-00670-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-025-00670-3?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Roberto Andreani & Marcos Raydan, 2021. "Properties of the delayed weighted gradient method," Computational Optimization and Applications, Springer, vol. 78(1), pages 167-180, January.
    2. Bonettini, Silvia & Prato, Marco & Rebegoldi, Simone, 2016. "A cyclic block coordinate descent method with generalized gradient projections," Applied Mathematics and Computation, Elsevier, vol. 286(C), pages 288-300.
    3. Serena Crisci & Federica Porta & Valeria Ruggiero & Luca Zanni, 2023. "Hybrid limited memory gradient projection methods for box-constrained optimization problems," Computational Optimization and Applications, Springer, vol. 84(1), pages 151-189, January.
    4. Stefania Corsaro & Valentina Simone, 2019. "Adaptive $$l_1$$-regularization for short-selling control in portfolio selection," Computational Optimization and Applications, Springer, vol. 72(2), pages 457-478, March.
    5. Corsaro, Stefania & De Simone, Valentina & Marino, Zelda, 2021. "Split Bregman iteration for multi-period mean variance portfolio optimization," Applied Mathematics and Computation, Elsevier, vol. 392(C).
    6. Yakui Huang & Yu-Hong Dai & Xin-Wei Liu & Hongchao Zhang, 2022. "On the acceleration of the Barzilai–Borwein method," Computational Optimization and Applications, Springer, vol. 81(3), pages 717-740, April.
    7. di Serafino, Daniela & Ruggiero, Valeria & Toraldo, Gerardo & Zanni, Luca, 2018. "On the steplength selection in gradient methods for unconstrained optimization," Applied Mathematics and Computation, Elsevier, vol. 318(C), pages 176-195.
    8. Na Huang, 2022. "On R-linear convergence analysis for a class of gradient methods," Computational Optimization and Applications, Springer, vol. 81(1), pages 161-177, January.
    9. Marco Viola & Mara Sangiovanni & Gerardo Toraldo & Mario R. Guarracino, 2019. "Semi-supervised generalized eigenvalues classification," Annals of Operations Research, Springer, vol. 276(1), pages 249-266, May.
    10. Masoud Fatemi, 2022. "On initial point selection of the steepest descent algorithm for general quadratic functions," Computational Optimization and Applications, Springer, vol. 82(2), pages 329-360, June.
    11. Clóvis Gonzaga & Ruana Schneider, 2016. "On the steepest descent algorithm for quadratic functions," Computational Optimization and Applications, Springer, vol. 63(2), pages 523-542, March.
    12. Stefania Corsaro & Valentina De Simone & Zelda Marino, 2021. "Fused Lasso approach in portfolio selection," Annals of Operations Research, Springer, vol. 299(1), pages 47-59, April.
    13. Tianji Wang & Qingdao Huang, 2025. "Research on Three-Dimensional Extension of Barzilai-Borwein-like Method," Mathematics, MDPI, vol. 13(2), pages 1-26, January.
    14. Crisci, Serena & Ruggiero, Valeria & Zanni, Luca, 2019. "Steplength selection in gradient projection methods for box-constrained quadratic programs," Applied Mathematics and Computation, Elsevier, vol. 356(C), pages 312-327.
    15. E. Loli Piccolomini & V. L. Coli & E. Morotti & L. Zanni, 2018. "Reconstruction of 3D X-ray CT images from reduced sampling by a scaled gradient projection algorithm," Computational Optimization and Applications, Springer, vol. 71(1), pages 171-191, September.
    16. Hugo Lara & Rafael Aleixo & Harry Oviedo, 2024. "Delayed Weighted Gradient Method with simultaneous step-sizes for strongly convex optimization," Computational Optimization and Applications, Springer, vol. 89(1), pages 151-182, September.
    17. Harry Fernando Oviedo Leon, 2019. "A delayed weighted gradient method for strictly convex quadratic minimization," Computational Optimization and Applications, Springer, vol. 74(3), pages 729-746, December.
    18. Yu-Hong Dai & Yakui Huang & Xin-Wei Liu, 2019. "A family of spectral gradient methods for optimization," Computational Optimization and Applications, Springer, vol. 74(1), pages 43-65, September.
    19. Behzad Azmi & Karl Kunisch, 2020. "Analysis of the Barzilai-Borwein Step-Sizes for Problems in Hilbert Spaces," Journal of Optimization Theory and Applications, Springer, vol. 185(3), pages 819-844, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:91:y:2025:i:1:d:10.1007_s10589-025-00670-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.