IDEAS home Printed from https://ideas.repec.org/a/spr/snopef/v6y2025i2d10.1007_s43069-025-00484-3.html

A New Hybrid Conjugate Gradient Method Based on a Convex Combination for Multiobjective Optimization

Authors

Listed:
  • Jamilu Yahaya

    (King Mongkut’s University of Technology Thonburi (KMUTT)
    Ahmadu Bello University)

  • Poom Kumam

    (King Mongkut’s University of Technology Thonburi (KMUTT))

  • Mahmoud Muhammad Yahaya

    (King Mongkut’s University of Technology Thonburi (KMUTT))

Abstract

Conjugate gradient methods are a crucial class of techniques for solving unconstrained optimization problems. While the Hestenes-Stiefel and other conjugate gradient methods have recently been extended to the multiobjective optimization setting, the Hestenes-Stiefel extension notably fails to ensure a descent direction. In contrast, the Conjugate Descent method consistently ensures a descent direction in this context. In this paper, we introduce a hybrid conjugate gradient method that combines a modified Hestenes-Stiefel method with the Conjugate Descent method through a convex combination. The convex coefficient parameter is derived using the Dai-Liao conjugacy condition and the condition for Newton's direction. Crucially, this hybrid approach guarantees the sufficient descent property under the Wolfe line search conditions and establishes global convergence under mild assumptions, without requiring algorithmic restarts or convexity of the objective functions. Preliminary numerical results demonstrate the clear advantages of this hybrid method over some existing conjugate gradient techniques.
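To make the abstract's idea concrete, the sketch below illustrates a hybrid conjugate gradient iteration in which the direction parameter is a convex combination beta = (1 - theta) * beta_HS + theta * beta_CD of the Hestenes-Stiefel and Conjugate Descent parameters. This is an illustrative single-objective simplification, not the paper's method: the fixed `theta` stands in for the adaptive convex coefficient derived from the Dai-Liao condition, a simple Armijo backtracking replaces the Wolfe line search, and the function names (`hybrid_cg`, `f`, `grad`) are this sketch's own.

```python
import numpy as np

def hybrid_cg(f, grad, x0, theta=0.5, tol=1e-8, max_iter=500):
    """Illustrative hybrid CG with beta = (1-theta)*beta_HS + theta*beta_CD."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g.copy()                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (the paper assumes Wolfe conditions)
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                  # gradient difference
        # Hestenes-Stiefel parameter: g_{k+1}^T y_k / (d_k^T y_k)
        denom_hs = d @ y
        beta_hs = (g_new @ y) / denom_hs if abs(denom_hs) > 1e-12 else 0.0
        # Conjugate Descent parameter: ||g_{k+1}||^2 / (-d_k^T g_k)
        denom_cd = -(d @ g)
        beta_cd = (g_new @ g_new) / denom_cd if abs(denom_cd) > 1e-12 else 0.0
        # Convex combination of the two parameters
        beta = (1.0 - theta) * beta_hs + theta * beta_cd
        d = -g_new + beta * d
        if g_new @ d >= 0:             # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Demo on a convex quadratic f(x) = 0.5 x^T A x - b^T x, minimizer A^{-1} b
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * (x @ A @ x) - b @ x
grad = lambda x: A @ x - b
x_star = hybrid_cg(f, grad, np.zeros(2))
```

In the multiobjective setting of the paper, the gradient and descent-direction computations are replaced by a steepest-descent-type subproblem over all objectives; this scalar sketch only shows how the convex combination of the two beta parameters enters the direction update.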

Suggested Citation

  • Jamilu Yahaya & Poom Kumam & Mahmoud Muhammad Yahaya, 2025. "A New Hybrid Conjugate Gradient Method Based on a Convex Combination for Multiobjective Optimization," SN Operations Research Forum, Springer, vol. 6(2), pages 1-26, June.
  • Handle: RePEc:spr:snopef:v:6:y:2025:i:2:d:10.1007_s43069-025-00484-3
    DOI: 10.1007/s43069-025-00484-3

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s43069-025-00484-3
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s43069-025-00484-3?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Gonçalves, M.L.N. & Lima, F.S. & Prudente, L.F., 2022. "A study of Liu-Storey conjugate gradient methods for vector optimization," Applied Mathematics and Computation, Elsevier, vol. 425(C).
    2. M. L. N. Gonçalves & F. S. Lima & L. F. Prudente, 2022. "Globally convergent Newton-type methods for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 83(2), pages 403-434, November.
    3. J. Fliege & L. N. Vicente, 2006. "Multicriteria Approach to Bilevel Optimization," Journal of Optimization Theory and Applications, Springer, vol. 131(2), pages 209-225, November.
    4. Miglierina, E. & Molho, E. & Recchioni, M.C., 2008. "Box-constrained multi-objective optimization: A gradient-like method without "a priori" scalarization," European Journal of Operational Research, Elsevier, vol. 188(3), pages 662-682, August.
    5. Neculai Andrei, 2022. "Modern Numerical Nonlinear Optimization," Springer Optimization and Its Applications, Springer, number 978-3-031-08720-2, December.
    6. White, D.J., 1998. "Epsilon-dominating solutions in mean-variance portfolio analysis," European Journal of Operational Research, Elsevier, vol. 105(3), pages 457-466, March.
    7. Qingjie Hu & Liping Zhu & Yu Chen, 2024. "Alternative extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 88(1), pages 217-250, May.
    8. N. Andrei, 2009. "Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 141(2), pages 249-264, May.
    9. Jörg Fliege & Benar Fux Svaiter, 2000. "Steepest descent methods for multicriteria optimization," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 51(3), pages 479-494, August.
    10. Johannes Jahn & Andreas Kirsch & Carmen Wagner, 2004. "Optimization of rod antennas of mobile phones," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 59(1), pages 37-51, February.
    11. M. L. N. Gonçalves & L. F. Prudente, 2020. "On the extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 76(3), pages 889-916, July.
    12. Jamilu Yahaya & Poom Kumam & Sani Salisu & Kanokwan Sitthithakerngkiet, 2024. "Spectral-like conjugate gradient methods with sufficient descent property for vector optimization," PLOS ONE, Public Library of Science, vol. 19(5), pages 1-22, May.
    13. C. Hillermeier, 2001. "Generalized Homotopy Approach to Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 110(3), pages 557-583, September.
    14. Thai Chuong, 2013. "Newton-like methods for efficient solutions in vector optimization," Computational Optimization and Applications, Springer, vol. 54(3), pages 495-516, April.
    15. L. F. Prudente & D. R. Souza, 2022. "A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 1107-1140, September.
    16. Neculai Andrei, 2020. "Nonlinear Conjugate Gradient Methods for Unconstrained Optimization," Springer Optimization and Its Applications, Springer, number 978-3-030-42950-8, December.
    17. Qing-Rui He & Chun-Rong Chen & Sheng-Jie Li, 2023. "Spectral conjugate gradient methods for vector optimization problems," Computational Optimization and Applications, Springer, vol. 86(2), pages 457-489, November.
    18. Abubakar, Auwal Bala & Kumam, Poom & Malik, Maulana & Ibrahim, Abdulkarim Hassan, 2022. "A hybrid conjugate gradient based approach for solving unconstrained optimization and motion control problems," Mathematics and Computers in Simulation (MATCOM), Elsevier, vol. 201(C), pages 640-657.
    19. Neculai Andrei, 2020. "General Convergence Results for Nonlinear Conjugate Gradient Methods," Springer Optimization and Its Applications, in: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization, chapter 0, pages 89-123, Springer.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Qingjie Hu & Ruyun Li & Yanyan Zhang & Zhibin Zhu, 2024. "On the Extension of Dai-Liao Conjugate Gradient Method for Vector Optimization," Journal of Optimization Theory and Applications, Springer, vol. 203(1), pages 810-843, October.
    2. Qing-Rui He & Chun-Rong Chen & Sheng-Jie Li, 2023. "Spectral conjugate gradient methods for vector optimization problems," Computational Optimization and Applications, Springer, vol. 86(2), pages 457-489, November.
    3. Qing-Rui He & Sheng-Jie Li & Bo-Ya Zhang & Chun-Rong Chen, 2024. "A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization," Computational Optimization and Applications, Springer, vol. 89(3), pages 805-842, December.
    4. Matteo Lapucci & Pierluigi Mansueto, 2023. "A limited memory Quasi-Newton approach for multi-objective optimization," Computational Optimization and Applications, Springer, vol. 85(1), pages 33-73, May.
    5. Qingjie Hu & Liping Zhu & Yu Chen, 2024. "Alternative extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 88(1), pages 217-250, May.
    6. Gonçalves, M.L.N. & Lima, F.S. & Prudente, L.F., 2022. "A study of Liu-Storey conjugate gradient methods for vector optimization," Applied Mathematics and Computation, Elsevier, vol. 425(C).
    7. M. L. N. Gonçalves & L. F. Prudente, 2020. "On the extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 76(3), pages 889-916, July.
    8. L. F. Prudente & D. R. Souza, 2022. "A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 1107-1140, September.
    9. L. F. Prudente & D. R. Souza, 2024. "Global convergence of a BFGS-type algorithm for nonconvex multiobjective optimization problems," Computational Optimization and Applications, Springer, vol. 88(3), pages 719-757, July.
    10. Jiawei Chen & Yushan Bai & Guolin Yu & Xiaoqing Ou & Xiaolong Qin, 2025. "A PRP Type Conjugate Gradient Method Without Truncation for Nonconvex Vector Optimization," Journal of Optimization Theory and Applications, Springer, vol. 204(1), pages 1-30, January.
    11. M. L. N. Gonçalves & F. S. Lima & L. F. Prudente, 2022. "Globally convergent Newton-type methods for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 83(2), pages 403-434, November.
    12. P. B. Assunção & O. P. Ferreira & L. F. Prudente, 2021. "Conditional gradient method for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 78(3), pages 741-768, April.
    13. Chen, Wang & Yang, Xinmin & Zhao, Yong, 2023. "Memory gradient method for multiobjective optimization," Applied Mathematics and Computation, Elsevier, vol. 443(C).
    14. Jamilu Yahaya & Poom Kumam & Sani Salisu & Kanokwan Sitthithakerngkiet, 2024. "Spectral-like conjugate gradient methods with sufficient descent property for vector optimization," PLOS ONE, Public Library of Science, vol. 19(5), pages 1-22, May.
    15. Wang Chen & Xinmin Yang & Yong Zhao, 2023. "Conditional gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 85(3), pages 857-896, July.
    16. Nasiru Salihu & Poom Kumam & Aliyu Muhammed Awwal & Ibrahim Arzuka & Thidaporn Seangwattana, 2023. "A Structured Fletcher-Revees Spectral Conjugate Gradient Method for Unconstrained Optimization with Application in Robotic Model," SN Operations Research Forum, Springer, vol. 4(4), pages 1-25, December.
    17. Miglierina, E. & Molho, E. & Recchioni, M.C., 2008. "Box-constrained multi-objective optimization: A gradient-like method without "a priori" scalarization," European Journal of Operational Research, Elsevier, vol. 188(3), pages 662-682, August.
    18. Wumei Sun & Hongwei Liu & Zexian Liu, 2021. "A Class of Accelerated Subspace Minimization Conjugate Gradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 190(3), pages 811-840, September.
    19. G. Cocchi & M. Lapucci, 2020. "An augmented Lagrangian algorithm for multi-objective optimization," Computational Optimization and Applications, Springer, vol. 77(1), pages 29-56, September.
    20. Xiaopeng Zhao & Jen-Chih Yao, 2022. "Linear convergence of a nonmonotone projected gradient method for multiobjective optimization," Journal of Global Optimization, Springer, vol. 82(3), pages 577-594, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:snopef:v:6:y:2025:i:2:d:10.1007_s43069-025-00484-3. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.