Printed from https://ideas.repec.org/a/spr/joptap/v206y2025i3d10.1007_s10957-025-02737-x.html

Generalized Conditional Gradient Methods for Multiobjective Composite Optimization Problems with Hölder Condition

Author

Listed:
  • Wang Chen

    (Chongqing Normal University)

  • Liping Tang

    (Chongqing Normal University)

  • Xinmin Yang

    (Chongqing Normal University)

Abstract

In this paper, we deal with multiobjective composite optimization problems, where each objective function is the sum of a smooth function and a possibly non-smooth function. We first propose a parameter-dependent generalized conditional gradient method to solve this problem. The step size in this method requires prior knowledge of the parameters related to the Hölder continuity of the gradient of the smooth function. The convergence properties of this method are then established. Given that these parameters may be unknown or, if known, may not be unique, the first method may encounter implementation challenges or slow convergence. To address this, we further propose a parameter-free version of the first method that determines the step size using a local quadratic upper approximation and an adaptive line search strategy, eliminating the need for any problem-specific parameters. The performance of the proposed methods is demonstrated on several test problems involving the indicator function and an uncertainty function.
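To make the abstract's parameter-free idea concrete, the following is a minimal single-objective sketch of a generalized conditional gradient (Frank–Wolfe-type) step with an adaptive line search driven by a local quadratic upper model. It is an illustrative assumption, not the authors' algorithm: the multiobjective machinery (subproblems aggregating all objectives) is omitted, and all names (`gcg_adaptive`, `lmo`, the doubling/halving of the estimate `L`) are hypothetical choices standing in for the paper's scheme.

```python
import numpy as np

def gcg_adaptive(grad, f, lmo, x0, max_iter=100, L0=1.0, tol=1e-8):
    """Sketch of a parameter-free conditional gradient method.

    grad : gradient of the smooth part of the objective
    f    : the smooth objective value
    lmo  : linear minimization oracle, s = argmin_{s in C} <g, s>
    L0   : initial guess of the Hölder/Lipschitz-type constant,
           adapted on the fly so no true constant is needed.
    """
    x = x0.astype(float)
    L = L0
    for _ in range(max_iter):
        g = grad(x)
        s = lmo(g)                  # direction-finding subproblem over C
        d = s - x
        gap = -g @ d                # Frank-Wolfe gap: stationarity measure
        if gap <= tol:
            break
        # adaptive line search: increase L until the local quadratic
        # upper approximation f(x) - t*gap + (L/2) t^2 ||d||^2 is valid
        while True:
            t = min(1.0, gap / (L * (d @ d)))
            x_new = x + t * d
            if f(x_new) <= f(x) - t * gap + 0.5 * L * t**2 * (d @ d):
                break
            L *= 2.0
        x = x_new
        L *= 0.5                    # let the estimate shrink again
    return x
```

For instance, minimizing `0.5*||x - c||^2` over the box `[0,1]^n` (whose oracle just picks the vertex `s_i = 1` where `g_i < 0`) recovers the projection of `c` onto the box without supplying any smoothness constant in advance.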

Suggested Citation

  • Wang Chen & Liping Tang & Xinmin Yang, 2025. "Generalized Conditional Gradient Methods for Multiobjective Composite Optimization Problems with Hölder Condition," Journal of Optimization Theory and Applications, Springer, vol. 206(3), pages 1-27, September.
  • Handle: RePEc:spr:joptap:v:206:y:2025:i:3:d:10.1007_s10957-025-02737-x
    DOI: 10.1007/s10957-025-02737-x

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10957-025-02737-x
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10957-025-02737-x?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Yu Chen & Helong Chen & Zhibin Zhu, 2026. "A Three-Term Conjugate Gradient-Type Method with Sufficient Descent Property for Vector Optimization," Journal of Optimization Theory and Applications, Springer, vol. 208(1), pages 1-42, January.
    2. Anteneh Getachew Gebrie & Ellen Hidemi Fukuda, 2025. "Adaptive Generalized Conditional Gradient Method for Multiobjective Optimization," Journal of Optimization Theory and Applications, Springer, vol. 206(1), pages 1-27, July.
    3. Douglas S. Gonçalves & Max L. N. Gonçalves & Jefferson G. Melo, 2024. "An away-step Frank–Wolfe algorithm for constrained multiobjective optimization," Computational Optimization and Applications, Springer, vol. 88(3), pages 759-781, July.
    4. Douglas S. Gonçalves & Max L. N. Gonçalves & Jefferson G. Melo, 2025. "Improved Convergence Rates for the Multiobjective Frank–Wolfe Method," Journal of Optimization Theory and Applications, Springer, vol. 205(2), pages 1-25, May.
    5. Jian Chen & Wang Chen & Liping Tang & Xinmin Yang, 2026. "Preconditioned Barzilai-Borwein Methods for Multiobjective Optimization Problems," Journal of Optimization Theory and Applications, Springer, vol. 208(1), pages 1-43, January.
    6. Xiaopeng Zhao & Ravi Raushan & Debdas Ghosh & Jen-Chih Yao & Min Qi, 2025. "Proximal gradient method for convex multiobjective optimization problems without Lipschitz continuous gradients," Computational Optimization and Applications, Springer, vol. 91(1), pages 27-66, May.
    7. Qing-Rui He & Sheng-Jie Li & Bo-Ya Zhang & Chun-Rong Chen, 2024. "A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization," Computational Optimization and Applications, Springer, vol. 89(3), pages 805-842, December.
    8. Feng Guo & Liguo Jiao, 2023. "A new scheme for approximating the weakly efficient solution set of vector rational optimization problems," Journal of Global Optimization, Springer, vol. 86(4), pages 905-930, August.
    9. Kangming Chen & Ellen Hidemi Fukuda & Hiroyuki Sato, 2025. "Nonlinear conjugate gradient method for vector optimization on Riemannian manifolds with retraction and vector transport," Applied Mathematics and Computation, Elsevier, vol. 486(C).
    10. Matteo Lapucci & Pierluigi Mansueto, 2024. "Cardinality-Constrained Multi-objective Optimization: Novel Optimality Conditions and Algorithms," Journal of Optimization Theory and Applications, Springer, vol. 201(1), pages 323-351, April.
    11. P. Kesarwani & P. K. Shukla & J. Dutta & K. Deb, 2022. "Approximations for Pareto and Proper Pareto solutions and their KKT conditions," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 96(1), pages 123-148, August.
    12. Shahabeddin Najafi & Masoud Hajarian, 2024. "Multiobjective BFGS method for optimization on Riemannian manifolds," Computational Optimization and Applications, Springer, vol. 87(2), pages 337-354, March.
    13. Elena Tovbis & Vladimir Krutikov & Predrag Stanimirović & Vladimir Meshechkin & Aleksey Popov & Lev Kazakovtsev, 2023. "A Family of Multi-Step Subgradient Minimization Methods," Mathematics, MDPI, vol. 11(10), pages 1-24, May.
    14. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    15. Bo Jiang & Tianyi Lin & Shiqian Ma & Shuzhong Zhang, 2019. "Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis," Computational Optimization and Applications, Springer, vol. 72(1), pages 115-157, January.
    16. M.L.N. Gonçalves & F.S. Lima & L.F. Prudente, 2022. "A study of Liu-Storey conjugate gradient methods for vector optimization," Applied Mathematics and Computation, Elsevier, vol. 425(C).
    17. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    18. Wang Chen & Xinmin Yang & Yong Zhao, 2023. "Memory gradient method for multiobjective optimization," Applied Mathematics and Computation, Elsevier, vol. 443(C).
    19. Fedor Stonyakin & Alexander Gasnikov & Pavel Dvurechensky & Alexander Titov & Mohammad Alkousa, 2022. "Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle," Journal of Optimization Theory and Applications, Springer, vol. 194(3), pages 988-1013, September.
    20. Fedor Stonyakin & Ilya Kuruzov & Boris Polyak, 2023. "Stopping Rules for Gradient Methods for Non-convex Problems with Additive Noise in Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(2), pages 531-551, August.



    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:joptap:v:206:y:2025:i:3:d:10.1007_s10957-025-02737-x. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.