
A New Subspace Minimization Conjugate Gradient Method for Unconstrained Minimization

Authors

Listed:
  • Zexian Liu (Guizhou University)
  • Yan Ni (Guizhou University)
  • Hongwei Liu (Xidian University)
  • Wumei Sun (Xi’an University of Science and Technology)

Abstract

Subspace minimization conjugate gradient (SMCG) methods are a class of quite efficient iterative methods for unconstrained optimization that have received increasing attention recently. The search directions of SMCG methods are generated by minimizing an approximate model with approximate matrix $$B_k$$ over the two-dimensional subspace spanned by the current gradient $$g_k$$ and the latest step. The main drawback of SMCG methods is that the parameter $$g_k^T B_k g_k$$ appearing in the search direction must be determined when the direction is computed; this parameter is crucial to SMCG methods and is difficult to determine properly. A possible remedy for this drawback is to derive SMCG methods in a way that does not depend on $$g_k^T B_k g_k$$. The projection technique has been used successfully to derive conjugate gradient directions, such as the Dai–Kou conjugate gradient direction (Dai and Kou in SIAM J Optim 23(1):296–320, 2013). Motivated by these two observations, we use a projection technique to derive a new SMCG method that is independent of $$g_k^T B_k g_k$$. More specifically, we project the search direction of the memoryless quasi-Newton method onto the above two-dimensional subspace and obtain a new search direction, which is proved to be a descent direction. Remarkably, the proposed method, without any line search, enjoys the finite termination property for two-dimensional strictly convex quadratic functions, and an adaptive scaling factor in the search direction is exploited based on this property. The proposed method does not need to determine the parameter $$g_k^T B_k g_k$$ and can be regarded as an extension of the Dai–Kou conjugate gradient method. Global convergence of the proposed method is established under suitable assumptions. Numerical comparisons on 147 test functions from the CUTEst library indicate that the proposed method is very promising.
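For context, the following is a minimal sketch of the classical SMCG subproblem underlying the abstract, written in the abstract's notation; the step $$s_{k-1} = x_k - x_{k-1}$$ and the quadratic model form are assumptions taken from the general SMCG literature, not necessarily this paper's exact formulation:

$$ \min_{d \in \operatorname{span}\{g_k,\, s_{k-1}\}} \; g_k^T d + \tfrac{1}{2}\, d^T B_k d, \qquad d = \mu g_k + \nu s_{k-1}. $$

Substituting $$d = \mu g_k + \nu s_{k-1}$$ reduces the subproblem to a $$2 \times 2$$ linear system in $$(\mu, \nu)$$ whose coefficients include $$g_k^T B_k g_k$$, which is why that parameter ordinarily has to be estimated. The projection approach described in the abstract instead projects a memoryless quasi-Newton direction onto $$\operatorname{span}\{g_k, s_{k-1}\}$$, so the term $$g_k^T B_k g_k$$ never needs to be specified.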

Suggested Citation

  • Zexian Liu & Yan Ni & Hongwei Liu & Wumei Sun, 2024. "A New Subspace Minimization Conjugate Gradient Method for Unconstrained Minimization," Journal of Optimization Theory and Applications, Springer, vol. 200(2), pages 820-851, February.
  • Handle: RePEc:spr:joptap:v:200:y:2024:i:2:d:10.1007_s10957-023-02325-x
    DOI: 10.1007/s10957-023-02325-x

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-023-02325-x
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-023-02325-x?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription

    As access to this document is restricted, you may want to search for a different version of it.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:200:y:2024:i:2:d:10.1007_s10957-023-02325-x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.