Printed from https://ideas.repec.org/a/spr/coopap/v89y2024i3d10.1007_s10589-024-00609-0.html

A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization

Authors

Listed:
  • Qing-Rui He

    (Chongqing University)

  • Sheng-Jie Li

    (Chongqing University
    Key Laboratory of Nonlinear Analysis and its Applications (Chongqing University), Ministry of Education)

  • Bo-Ya Zhang

    (Chongqing University)

  • Chun-Rong Chen

    (Chongqing University
    Key Laboratory of Nonlinear Analysis and its Applications (Chongqing University), Ministry of Education)

Abstract

In this paper, we propose a new way of modifying the conjugate parameter to ensure its positiveness and, based on the Dai-Yuan (DY) method in the vector setting, introduce an associated family of conjugate gradient (CG) methods with guaranteed descent for solving unconstrained vector optimization problems. Several special members of the family are analyzed, and the (sufficient) descent condition is established for them in the vector sense. Under mild conditions, a general convergence result is presented for CG methods with specific parameters; in particular, it covers the global convergence of the aforementioned members. Furthermore, for comparison, we consider direct vector extensions of some Dai-Yuan-type methods obtained by modifying the scalar DY method. These extensions recover the classical parameters in the scalar minimization case, and their descent properties and global convergence are likewise studied under mild assumptions. Finally, numerical experiments illustrate the practical behavior of all proposed methods.
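For readers unfamiliar with the Dai-Yuan parameter that the paper generalizes, the following is a minimal sketch of the classical *scalar* DY conjugate gradient method (not the vector-valued family proposed in the article), applied to a convex quadratic with an exact line search; the function name `dy_cg` and the quadratic test problem are illustrative assumptions, not from the paper.

```python
import numpy as np

def dy_cg(A, b, x0, tol=1e-8, max_iter=100):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    with the classical scalar Dai-Yuan conjugate gradient method."""
    x = x0.astype(float)
    g = A @ x - b          # gradient of the quadratic
    d = -g                 # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # exact line search is available in closed form for a quadratic
        alpha = -(g @ d) / (d @ A @ d)
        x = x + alpha * d
        g_new = A @ x - b
        # Dai-Yuan parameter: ||g_{k+1}||^2 / d_k^T (g_{k+1} - g_k),
        # positive under (Wolfe) line-search conditions
        beta = (g_new @ g_new) / (d @ (g_new - g))
        d = -g_new + beta * d
        g = g_new
    return x
```

On a quadratic with exact line searches all classical CG parameters coincide, so `dy_cg(A, b, x0)` recovers the solution of `A x = b`; the paper's contribution lies in modifying this beta so that positiveness and descent are guaranteed in the vector-optimization setting.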

Suggested Citation

  • Qing-Rui He & Sheng-Jie Li & Bo-Ya Zhang & Chun-Rong Chen, 2024. "A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization," Computational Optimization and Applications, Springer, vol. 89(3), pages 805-842, December.
  • Handle: RePEc:spr:coopap:v:89:y:2024:i:3:d:10.1007_s10589-024-00609-0
    DOI: 10.1007/s10589-024-00609-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-024-00609-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-024-00609-0?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item
    ---><---

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Hiroki Tanabe & Ellen H. Fukuda & Nobuo Yamashita, 2019. "Proximal gradient methods for multiobjective optimization and their applications," Computational Optimization and Applications, Springer, vol. 72(2), pages 339-361, March.
    2. Hiroki Tanabe & Ellen H. Fukuda & Nobuo Yamashita, 2023. "An accelerated proximal gradient method for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 86(2), pages 421-455, November.
    3. Matteo Lapucci & Pierluigi Mansueto, 2023. "A limited memory Quasi-Newton approach for multi-objective optimization," Computational Optimization and Applications, Springer, vol. 85(1), pages 33-73, May.
    4. Zhu, Zhibin & Zhang, Dongdong & Wang, Shuo, 2020. "Two modified DY conjugate gradient methods for unconstrained optimization problems," Applied Mathematics and Computation, Elsevier, vol. 373(C).
    5. Ceng, Lu-Chuan & Yao, Jen-Chih, 2007. "Approximate proximal methods in vector optimization," European Journal of Operational Research, Elsevier, vol. 183(1), pages 1-19, November.
    6. Mustapha El Moudden & Abdelkrim El Mouatasim, 2021. "Accelerated Diagonal Steepest Descent Method for Unconstrained Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 188(1), pages 220-242, January.
    7. Thai Chuong, 2013. "Newton-like methods for efficient solutions in vector optimization," Computational Optimization and Applications, Springer, vol. 54(3), pages 495-516, April.
    8. Miglierina, E. & Molho, E. & Recchioni, M.C., 2008. "Box-constrained multi-objective optimization: A gradient-like method without "a priori" scalarization," European Journal of Operational Research, Elsevier, vol. 188(3), pages 662-682, August.
    9. Kanako Mita & Ellen H. Fukuda & Nobuo Yamashita, 2019. "Nonmonotone line searches for unconstrained multiobjective optimization problems," Journal of Global Optimization, Springer, vol. 75(1), pages 63-90, September.
    10. M. L. N. Gonçalves & L. F. Prudente, 2020. "On the extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 76(3), pages 889-916, July.
    11. Morovati, Vahid & Pourkarimi, Latif, 2019. "Extension of Zoutendijk method for solving constrained multiobjective optimization problems," European Journal of Operational Research, Elsevier, vol. 273(1), pages 44-57.
    12. C. Hillermeier, 2001. "Generalized Homotopy Approach to Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 110(3), pages 557-583, September.
    13. Wang Chen & Xinmin Yang & Yong Zhao, 2023. "Conditional gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 85(3), pages 857-896, July.
    14. Qing-Rui He & Chun-Rong Chen & Sheng-Jie Li, 2023. "Spectral conjugate gradient methods for vector optimization problems," Computational Optimization and Applications, Springer, vol. 86(2), pages 457-489, November.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Qing-Rui He & Chun-Rong Chen & Sheng-Jie Li, 2023. "Spectral conjugate gradient methods for vector optimization problems," Computational Optimization and Applications, Springer, vol. 86(2), pages 457-489, November.
    2. Qingjie Hu & Ruyun Li & Yanyan Zhang & Zhibin Zhu, 2024. "On the Extension of Dai-Liao Conjugate Gradient Method for Vector Optimization," Journal of Optimization Theory and Applications, Springer, vol. 203(1), pages 810-843, October.
    3. M. L. N. Gonçalves & L. F. Prudente, 2020. "On the extension of the Hager–Zhang conjugate gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 76(3), pages 889-916, July.
    4. L. F. Prudente & D. R. Souza, 2022. "A Quasi-Newton Method with Wolfe Line Searches for Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 1107-1140, September.
    5. Gonçalves, M.L.N. & Lima, F.S. & Prudente, L.F., 2022. "A study of Liu-Storey conjugate gradient methods for vector optimization," Applied Mathematics and Computation, Elsevier, vol. 425(C).
    6. M. L. N. Gonçalves & F. S. Lima & L. F. Prudente, 2022. "Globally convergent Newton-type methods for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 83(2), pages 403-434, November.
    7. P. B. Assunção & O. P. Ferreira & L. F. Prudente, 2021. "Conditional gradient method for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 78(3), pages 741-768, April.
    8. Hiroki Tanabe & Ellen H. Fukuda & Nobuo Yamashita, 2023. "An accelerated proximal gradient method for multiobjective optimization," Computational Optimization and Applications, Springer, vol. 86(2), pages 421-455, November.
    9. Wang Chen & Xinmin Yang & Yong Zhao, 2023. "Conditional gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 85(3), pages 857-896, July.
    10. Chen, Wang & Yang, Xinmin & Zhao, Yong, 2023. "Memory gradient method for multiobjective optimization," Applied Mathematics and Computation, Elsevier, vol. 443(C).
    11. Xiaopeng Zhao & Jen-Chih Yao, 2022. "Linear convergence of a nonmonotone projected gradient method for multiobjective optimization," Journal of Global Optimization, Springer, vol. 82(3), pages 577-594, March.
    12. Chen, Jian & Tang, Liping & Yang, Xinmin, 2023. "A Barzilai-Borwein descent method for multiobjective optimization problems," European Journal of Operational Research, Elsevier, vol. 311(1), pages 196-209.
    13. Feng Guo & Liguo Jiao, 2023. "A new scheme for approximating the weakly efficient solution set of vector rational optimization problems," Journal of Global Optimization, Springer, vol. 86(4), pages 905-930, August.
    14. Miglierina, E. & Molho, E. & Recchioni, M.C., 2008. "Box-constrained multi-objective optimization: A gradient-like method without "a priori" scalarization," European Journal of Operational Research, Elsevier, vol. 188(3), pages 662-682, August.
    15. Erik Alex Papa Quiroz & Nancy Baygorrea Cusihuallpa & Nelson Maculan, 2020. "Inexact Proximal Point Methods for Multiobjective Quasiconvex Minimization on Hadamard Manifolds," Journal of Optimization Theory and Applications, Springer, vol. 186(3), pages 879-898, September.
    16. Villacorta, Kely D.V. & Oliveira, P. Roberto, 2011. "An interior proximal method in vector optimization," European Journal of Operational Research, Elsevier, vol. 214(3), pages 485-492, November.
    17. Lu-Chuan Ceng & Sy-Ming Guu & Jen-Chih Yao, 2014. "Hybrid methods with regularization for minimization problems and asymptotically strict pseudocontractive mappings in the intermediate sense," Journal of Global Optimization, Springer, vol. 60(4), pages 617-634, December.
    18. Recchioni, Maria Cristina & Tedeschi, Gabriele, 2017. "From bond yield to macroeconomic instability: A parsimonious affine model," European Journal of Operational Research, Elsevier, vol. 262(3), pages 1116-1135.
    19. Mingcheng Zuo & Yuan Xue, 2024. "Population Feasibility State Guided Autonomous Constrained Multi-Objective Evolutionary Optimization," Mathematics, MDPI, vol. 12(6), pages 1-24, March.
    20. G. Cocchi & M. Lapucci, 2020. "An augmented Lagrangian algorithm for multi-objective optimization," Computational Optimization and Applications, Springer, vol. 77(1), pages 29-56, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:89:y:2024:i:3:d:10.1007_s10589-024-00609-0. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.