
A Three-Term Conjugate Gradient-Type Method with Sufficient Descent Property for Vector Optimization

Author

Listed:
  • Yu Chen (Guangxi Normal University)
  • Helong Chen (Guangxi Normal University)
  • Zhibin Zhu (Guilin University of Electronic Technology)

Abstract

Vector optimization is an important class of optimization problems, and extensive research effort is currently devoted to developing solution methods for it. A range of classical approaches originally designed for scalar optimization have been adapted to the vector setting, including the steepest descent method, Newton's method, the quasi-Newton method, and the conjugate gradient method, among others. However, limited attention has been given to the three-term conjugate gradient method in the context of vector optimization. In this paper, based on the modified self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno (BFGS) method proposed by Kou and Dai (J. Optim. Theory Appl., 165(1): 209-224, 2015), we propose a novel three-term conjugate gradient-type method specifically designed for vector optimization problems. The method ensures the sufficient descent property independently of any line search strategy. Furthermore, the improved Wolfe line search is extended to vector optimization. The global convergence of the proposed method under the improved Wolfe line search is analyzed, showing that at least one accumulation point of the sequence generated by the algorithm is a K-critical point of the vector optimization problem. Numerical experiments on a set of benchmark test problems highlight the effectiveness of the proposed method compared with some existing gradient-based approaches.
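The abstract outlines the general structure of a three-term conjugate gradient iteration combined with a Wolfe line search. The sketch below is only a minimal illustration of that structure for a scalar-valued objective: it uses a standard three-term PRP-type direction update, which satisfies the sufficient descent condition d_k^T g_k = -||g_k||^2 independently of the step size, together with SciPy's Wolfe line search. It does not reproduce the paper's Kou-Dai-based formulas or their vector-valued extension, and all function names, coefficients, and parameter values in it are assumptions chosen for demonstration.

    # Illustrative sketch only: a generic three-term conjugate gradient iteration
    # (standard PRP-type update) for a scalar objective, with a Wolfe line search.
    # This is NOT the paper's Kou-Dai-based vector-valued method.
    import numpy as np
    from scipy.optimize import line_search

    def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                        # initial steepest-descent direction
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            # Wolfe line search along d (c1, c2 are the usual Wolfe constants)
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
            if alpha is None:         # fall back to a small fixed step if the search fails
                alpha = 1e-3
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g
            denom = d @ y
            if abs(denom) < 1e-12:    # restart with steepest descent if the update is unstable
                d = -g_new
            else:
                beta = (g_new @ y) / denom
                theta = (g_new @ d) / denom
                # three-term direction: yields d^T g = -||g||^2 (sufficient descent)
                d = -g_new + beta * d - theta * y
            x, g = x_new, g_new
        return x

    # Example usage: minimize the Rosenbrock function
    if __name__ == "__main__":
        f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
        grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                                   200 * (x[1] - x[0]**2)])
        print(three_term_cg(f, grad, np.array([-1.2, 1.0])))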

Suggested Citation

  • Yu Chen & Helong Chen & Zhibin Zhu, 2026. "A Three-Term Conjugate Gradient-Type Method with Sufficient Descent Property for Vector Optimization," Journal of Optimization Theory and Applications, Springer, vol. 208(1), pages 1-42, January.
  • Handle: RePEc:spr:joptap:v:208:y:2026:i:1:d:10.1007_s10957-025-02815-0
    DOI: 10.1007/s10957-025-02815-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-025-02815-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-025-02815-0?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hiroki Tanabe & Ellen H. Fukuda & Nobuo Yamashita, 2023. "An accelerated proximal gradient method for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 86(2), pages 421-455, November.
    2. Matteo Lapucci & Pierluigi Mansueto, 2023. "A limited memory Quasi-Newton approach for multi-objective optimization," Computational Optimization and Applications, Springer, vol. 85(1), pages 33-73, May.
    3. Ellen Fukuda & L. Graña Drummond, 2013. "Inexact projected gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 54(3), pages 473-493, April.
    4. C. X. Kou & Y. H. Dai, 2015. "A Modified Self-Scaling Memoryless Broyden–Fletcher–Goldfarb–Shanno Method for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 165(1), pages 209-224, April.
    5. Miglierina, E. & Molho, E. & Recchioni, M.C., 2008. "Box-constrained multi-objective optimization: A gradient-like method without "a priori" scalarization," European Journal of Operational Research, Elsevier, vol. 188(3), pages 662-682, August.
    6. P. B. Assunção & O. P. Ferreira & L. F. Prudente, 2021. "Conditional gradient method for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 78(3), pages 741-768, April.
    7. Kanako Mita & Ellen H. Fukuda & Nobuo Yamashita, 2019. "Nonmonotone line searches for unconstrained multiobjective optimization problems," Journal of Global Optimization, Springer, vol. 75(1), pages 63-90, September.
    8. Gravel, Marc & Martel, Jean Marc & Nadeau, Raymond & Price, Wilson & Tremblay, Richard, 1992. "A multicriterion view of optimal resource allocation in job-shop production," European Journal of Operational Research, Elsevier, vol. 61(1-2), pages 230-244, August.
    9. Qingjie Hu & Liping Zhu & Yu Chen, 2024. "Alternative extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 88(1), pages 217-250, May.
    10. Chen, Wang & Yang, Xinmin & Zhao, Yong, 2023. "Memory gradient method for multiobjective optimization," Applied Mathematics and Computation, Elsevier, vol. 443(C).
    11. Johannes Jahn & Andreas Kirsch & Carmen Wagner, 2004. "Optimization of rod antennas of mobile phones," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 59(1), pages 37-51, February.
    12. M. L. N. Gonçalves & L. F. Prudente, 2020. "On the extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 76(3), pages 889-916, July.
    13. Ceng, Lu-Chuan & Yao, Jen-Chih, 2007. "Approximate proximal methods in vector optimization," European Journal of Operational Research, Elsevier, vol. 183(1), pages 1-19, November.
    14. Qingjie Hu & Ruyun Li & Yanyan Zhang & Zhibin Zhu, 2024. "On the Extension of Dai-Liao Conjugate Gradient Method for Vector Optimization," Journal of Optimization Theory and Applications, Springer, vol. 203(1), pages 810-843, October.
    15. Qing-Rui He & Sheng-Jie Li & Bo-Ya Zhang & Chun-Rong Chen, 2024. "A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization," Computational Optimization and Applications, Springer, vol. 89(3), pages 805-842, December.
    16. Jing-jing Wang & Li-ping Tang & Xin-min Yang, 2024. "Spectral projected subgradient method with a 1-memory momentum term for constrained multiobjective optimization problem," Journal of Global Optimization, Springer, vol. 89(2), pages 277-302, June.
    17. C. Hillermeier, 2001. "Generalized Homotopy Approach to Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 110(3), pages 557-583, September.
    18. Mustapha El Moudden & Abdelkrim El Mouatasim, 2021. "Accelerated Diagonal Steepest Descent Method for Unconstrained Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 188(1), pages 220-242, January.
    19. Wang Chen & Xinmin Yang & Yong Zhao, 2023. "Conditional gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 85(3), pages 857-896, July.
    20. Thai Chuong, 2013. "Newton-like methods for efficient solutions in vector optimization," Computational Optimization and Applications, Springer, vol. 54(3), pages 495-516, April.
    21. Mustapha El Moudden & Ahmed El Ghali, 2018. "A new reduced gradient method for solving linearly constrained multiobjective optimization problems," Computational Optimization and Applications, Springer, vol. 71(3), pages 719-741, December.
    22. Avinoam Perry, 1977. "A Class of Conjugate Gradient Algorithms with a Two-Step Variable Metric Memory," Discussion Papers 269, Northwestern University, Center for Mathematical Studies in Economics and Management Science.
    23. Qing-Rui He & Chun-Rong Chen & Sheng-Jie Li, 2023. "Spectral conjugate gradient methods for vector optimization problems," Computational Optimization and Applications, Springer, vol. 86(2), pages 457-489, November.
    24. Andrei, Neculai, 2010. "Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization," European Journal of Operational Research, Elsevier, vol. 204(3), pages 410-420, August.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Qing-Rui He & Sheng-Jie Li & Bo-Ya Zhang & Chun-Rong Chen, 2024. "A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization," Computational Optimization and Applications, Springer, vol. 89(3), pages 805-842, December.
    2. Qing-Rui He & Chun-Rong Chen & Sheng-Jie Li, 2023. "Spectral conjugate gradient methods for vector optimization problems," Computational Optimization and Applications, Springer, vol. 86(2), pages 457-489, November.
    3. Qingjie Hu & Ruyun Li & Yanyan Zhang & Zhibin Zhu, 2024. "On the Extension of Dai-Liao Conjugate Gradient Method for Vector Optimization," Journal of Optimization Theory and Applications, Springer, vol. 203(1), pages 810-843, October.
    4. Jamilu Yahaya & Poom Kumam & Mahmoud Muhammad Yahaya, 2025. "A New Hybrid Conjugate Gradient Method Based on a Convex Combination for Multiobjective Optimization," SN Operations Research Forum, Springer, vol. 6(2), pages 1-26, June.
    5. Gonçalves, M.L.N. & Lima, F.S. & Prudente, L.F., 2022. "A study of Liu-Storey conjugate gradient methods for vector optimization," Applied Mathematics and Computation, Elsevier, vol. 425(C).
    6. Qingjie Hu & Liping Zhu & Yu Chen, 2024. "Alternative extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 88(1), pages 217-250, May.
    7. M. L. N. Gonçalves & L. F. Prudente, 2020. "On the extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 76(3), pages 889-916, July.
    8. L. F. Prudente & D. R. Souza, 2022. "A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 1107-1140, September.
    9. Jian Chen & Liping Tang & Xinmin Yang, 2025. "A Subspace Minimization Barzilai-Borwein Method for Multiobjective Optimization Problems," Computational Optimization and Applications, Springer, vol. 92(1), pages 155-178, September.
    10. L. F. Prudente & D. R. Souza, 2024. "Global convergence of a BFGS-type algorithm for nonconvex multiobjective optimization problems," Computational Optimization and Applications, Springer, vol. 88(3), pages 719-757, July.
    11. Jiawei Chen & Yushan Bai & Guolin Yu & Xiaoqing Ou & Xiaolong Qin, 2025. "A PRP Type Conjugate Gradient Method Without Truncation for Nonconvex Vector Optimization," Journal of Optimization Theory and Applications, Springer, vol. 204(1), pages 1-30, January.
    12. M. L. N. Gonçalves & F. S. Lima & L. F. Prudente, 2022. "Globally convergent Newton-type methods for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 83(2), pages 403-434, November.
    13. P. B. Assunção & O. P. Ferreira & L. F. Prudente, 2021. "Conditional gradient method for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 78(3), pages 741-768, April.
    14. Wang Chen & Liping Tang & Xinmin Yang, 2025. "Generalized Conditional Gradient Methods for Multiobjective Composite Optimization Problems with Hölder Condition," Journal of Optimization Theory and Applications, Springer, vol. 206(3), pages 1-27, September.
    15. Jian Chen & Wang Chen & Liping Tang & Xinmin Yang, 2026. "Preconditioned Barzilai-Borwein Methods for Multiobjective Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 208(1), pages 1-43, January.
    16. Wang Chen & Xinmin Yang & Yong Zhao, 2023. "Conditional gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 85(3), pages 857-896, July.
    17. Chen, Wang & Yang, Xinmin & Zhao, Yong, 2023. "Memory gradient method for multiobjective optimization," Applied Mathematics and Computation, Elsevier, vol. 443(C).
    18. G. Cocchi & M. Lapucci, 2020. "An augmented Lagrangian algorithm for multi-objective optimization," Computational Optimization and Applications, Springer, vol. 77(1), pages 29-56, September.
    19. Hiroki Tanabe & Ellen H. Fukuda & Nobuo Yamashita, 2023. "An accelerated proximal gradient method for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 86(2), pages 421-455, November.
    20. Jamilu Yahaya & Poom Kumam & Sani Salisu & Kanokwan Sitthithakerngkiet, 2024. "Spectral-like conjugate gradient methods with sufficient descent property for vector optimization," PLOS ONE, Public Library of Science, vol. 19(5), pages 1-22, May.

