A New Subspace Minimization Conjugate Gradient Method for Unconstrained Minimization

Authors

  • Zexian Liu (Guizhou University)
  • Yan Ni (Guizhou University)
  • Hongwei Liu (Xidian University)
  • Wumei Sun (Xi’an University of Science and Technology)

Abstract

Subspace minimization conjugate gradient (SMCG) methods are a class of quite efficient iterative methods for unconstrained optimization that have received increasing attention recently. Their search directions are generated by minimizing an approximate model, with approximate matrix $$B_k$$, over the two-dimensional subspace spanned by the current gradient $$g_k$$ and the latest step. The main drawback of SMCG methods is that the parameter $$g_k^TB_kg_k$$ must be determined when computing the search direction; this parameter is crucial to SMCG methods and is difficult to determine properly. An alternative remedy is to derive SMCG methods in a way that does not involve $$g_k^TB_kg_k$$. The projection technique has been used successfully to derive conjugate gradient directions such as the Dai–Kou direction (Dai and Kou in SIAM J Optim 23(1):296–320, 2013). Motivated by these two observations, we use a projection technique to derive a new SMCG method that is independent of $$g_k^TB_kg_k$$. More specifically, we project the search direction of the memoryless quasi-Newton method onto the above two-dimensional subspace and obtain a new search direction, which is proved to be a descent direction. Remarkably, even without any line search, the proposed method enjoys finite termination for two-dimensional strictly convex quadratic functions, and an adaptive scaling factor in the search direction is developed based on this property. The proposed method does not need to determine the parameter $$g_k^TB_kg_k$$ and can be regarded as an extension of the Dai–Kou conjugate gradient method. Global convergence is established under suitable assumptions, and numerical comparisons on 147 test functions from the CUTEst library indicate that the proposed method is very promising.
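
To see why the parameter $$g_k^TB_kg_k$$ arises in the first place, here is a minimal worked sketch of the generic SMCG subproblem described in the abstract. It is standard in the SMCG literature rather than specific to this paper, and the symbols $$s_{k-1}=x_k-x_{k-1}$$ (the latest step), $$y_{k-1}=g_k-g_{k-1}$$, and the quadratic model $$m_k$$ are assumed notation, not taken from the article itself. Writing a trial direction as $$d = u\,g_k + v\,s_{k-1}$$ and minimizing the model over the two-dimensional subspace yields a $$2\times 2$$ linear system whose coefficient matrix contains exactly the troublesome quantity:

$$\min_{u,v}\ m_k(d) = g_k^Td + \tfrac{1}{2}d^TB_kd \quad\Longleftrightarrow\quad \begin{pmatrix} g_k^TB_kg_k & g_k^TB_ks_{k-1} \\ g_k^TB_ks_{k-1} & s_{k-1}^TB_ks_{k-1} \end{pmatrix} \begin{pmatrix} u \\ v \end{pmatrix} = -\begin{pmatrix} g_k^Tg_k \\ g_k^Ts_{k-1} \end{pmatrix}.$$

Under the secant condition $$B_ks_{k-1}=y_{k-1}$$, the entries $$g_k^TB_ks_{k-1}=g_k^Ty_{k-1}$$ and $$s_{k-1}^TB_ks_{k-1}=s_{k-1}^Ty_{k-1}$$ are available from gradient information alone, but $$g_k^TB_kg_k$$ admits no such reduction and must be estimated separately. Projecting a memoryless quasi-Newton direction onto the subspace, as the paper proposes, avoids that estimate entirely.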

Suggested Citation

  • Zexian Liu & Yan Ni & Hongwei Liu & Wumei Sun, 2024. "A New Subspace Minimization Conjugate Gradient Method for Unconstrained Minimization," Journal of Optimization Theory and Applications, Springer, vol. 200(2), pages 820-851, February.
  • Handle: RePEc:spr:joptap:v:200:y:2024:i:2:d:10.1007_s10957-023-02325-x
    DOI: 10.1007/s10957-023-02325-x

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-023-02325-x
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-023-02325-x?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Nasiru Salihu & Poom Kumam & Aliyu Muhammed Awwal & Ibrahim Arzuka & Thidaporn Seangwattana, 2023. "A Structured Fletcher-Reeves Spectral Conjugate Gradient Method for Unconstrained Optimization with Application in Robotic Model," SN Operations Research Forum, Springer, vol. 4(4), pages 1-25, December.
    2. Jamilu Yahaya & Poom Kumam & Mahmoud Muhammad Yahaya, 2025. "A New Hybrid Conjugate Gradient Method Based on a Convex Combination for Multiobjective Optimization," SN Operations Research Forum, Springer, vol. 6(2), pages 1-26, June.
    3. Wumei Sun & Hongwei Liu & Zexian Liu, 2021. "A Class of Accelerated Subspace Minimization Conjugate Gradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 190(3), pages 811-840, September.
    4. Chen, Kangming & Fukuda, Ellen Hidemi & Sato, Hiroyuki, 2025. "Nonlinear conjugate gradient method for vector optimization on Riemannian manifolds with retraction and vector transport," Applied Mathematics and Computation, Elsevier, vol. 486(C).
    5. Salihu, Nasiru & Kumam, Poom & Sulaiman, Ibrahim Mohammed & Arzuka, Ibrahim & Kumam, Wiyada, 2024. "An efficient Newton-like conjugate gradient method with restart strategy and its application," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 226(C), pages 354-372.
    6. Abubakar, Auwal Bala & Kumam, Poom & Malik, Maulana & Ibrahim, Abdulkarim Hassan, 2022. "A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 201(C), pages 640-657.
    7. Nasiru Salihu & Poom Kumam & Aliyu Muhammed Awwal & Ibrahim Mohammed Sulaiman & Thidaporn Seangwattana, 2023. "The global convergence of spectral RMIL conjugate gradient method for unconstrained optimization with applications to robotic model and image recovery," PLOS ONE, Public Library of Science, vol. 18(3), pages 1-19, March.
    8. Tiantian Zhao & Wei Hong Yang, 2023. "A Nonlinear Conjugate Gradient Method Using Inexact First-Order Information," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 502-530, August.
    9. George Baravdish & Gabriel Eilertsen & Rym Jaroudi & B. Tomas Johansson & Lukáš Malý & Jonas Unger, 2024. "A Hybrid Sobolev Gradient Method for Learning NODEs," SN Operations Research Forum, Springer, vol. 5(4), pages 1-39, December.
    10. Zhu, Zhibin & Zhang, Dongdong & Wang, Shuo, 2020. "Two modified DY conjugate gradient methods for unconstrained optimization problems," Applied Mathematics and Computation, Elsevier, vol. 373(C).
    11. Deng, Huaijun & Liu, Linna & Fang, Jianyin & Qu, Boyang & Huang, Quanzhen, 2023. "A novel improved whale optimization algorithm for optimization problems with multi-strategy and hybrid algorithm," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 205(C), pages 794-817.
    12. Zhang, Zemian & Chen, Xuesong, 2021. "A conjugate gradient method for distributed optimal control problems with nonhomogeneous Helmholtz equation," Applied Mathematics and Computation, Elsevier, vol. 402(C).
    13. Vladimir Rakočević & Milena J. Petrović, 2022. "Comparative Analysis of Accelerated Models for Solving Unconstrained Optimization Problems with Application of Khan’s Hybrid Rule," Mathematics, MDPI, vol. 10(23), pages 1-13, November.
