
Efficiency of higher-order algorithms for minimizing composite functions

Authors

Listed:
  • Yassine Nabou

    (National University of Science and Technology Politehnica Bucharest)

  • Ion Necoara

    (National University of Science and Technology Politehnica Bucharest
    Gheorghe Mihoc-Caius Iacob Institute of Mathematical Statistics and Applied Mathematics of the Romanian Academy)

Abstract

Composite minimization involves a collection of functions that are aggregated in a nonsmooth manner. It covers, as particular cases, smooth approximations of minimax games, minimization of max-type functions, and simple composite minimization problems, where the objective function has a nonsmooth component. We design a higher-order majorization algorithmic framework for fully composite problems (possibly nonconvex). Our framework replaces each component with a higher-order surrogate such that the corresponding error function has a higher-order Lipschitz continuous derivative. We present convergence guarantees for our method on composite optimization problems with (non)convex and (non)smooth objective functions. In particular, we prove convergence to stationary points for general nonconvex (possibly nonsmooth) problems, and under the Kurdyka–Łojasiewicz (KL) property of the objective function we derive improved rates depending on the KL parameter. For convex (possibly nonsmooth) problems we also provide sublinear convergence rates.
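The framework described in the abstract is a majorization–minimization scheme: at each iteration every component of the objective is replaced by a higher-order surrogate whose error function has a Lipschitz continuous higher-order derivative, and the next iterate minimizes the resulting model. Below is a minimal sketch in Python of one instance of this idea for the simple composite case min f(x) + h(x), with f smooth and h nonsmooth: f is replaced by its second-order Taylor expansion plus a cubic regularization term (the p = 2 case, in the spirit of Nesterov's cubic regularization), while h is kept exactly in the subproblem. The toy functions f and h, the constant M, and the derivative-free inner solver are illustrative assumptions, not taken from the paper.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical toy problem: smooth nonconvex f plus a nonsmooth l1 term h.
    def f(x):
        return 0.25 * np.sum(x**4) - np.sum(np.cos(x))

    def grad_f(x):
        return x**3 + np.sin(x)

    def hess_f(x):
        return np.diag(3.0 * x**2 + np.cos(x))

    lam = 0.1
    def h(x):
        # Nonsmooth component; it enters the surrogate subproblem unchanged.
        return lam * np.linalg.norm(x, 1)

    def surrogate(y, x, M):
        # Second-order Taylor model of f at x plus cubic regularization, plus h.
        # For M large enough (dominating the Lipschitz constant of the Hessian),
        # this model majorizes f + h, so each step cannot increase the objective.
        d = y - x
        taylor = f(x) + grad_f(x) @ d + 0.5 * d @ hess_f(x) @ d
        return taylor + (M / 6.0) * np.linalg.norm(d) ** 3 + h(y)

    def higher_order_mm(x0, M=10.0, iters=30):
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            # Solve the nonsmooth surrogate subproblem approximately; a
            # derivative-free method suffices for this low-dimensional sketch.
            x = minimize(surrogate, x, args=(x, M), method="Nelder-Mead").x
        return x

    x_star = higher_order_mm([2.0, -1.5, 0.7])
    print("approx. stationary point:", x_star, "objective:", f(x_star) + h(x_star))

Because each surrogate majorizes the objective for suitable M, the objective values are monotonically nonincreasing, which is the basic mechanism behind the stationary-point guarantees described above; the paper's actual framework is more general, covering pth-order surrogates and fully composite (e.g., max-type) models.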

Suggested Citation

  • Yassine Nabou & Ion Necoara, 2024. "Efficiency of higher-order algorithms for minimizing composite functions," Computational Optimization and Applications, Springer, vol. 87(2), pages 441-473, March.
  • Handle: RePEc:spr:coopap:v:87:y:2024:i:2:d:10.1007_s10589-023-00533-9
    DOI: 10.1007/s10589-023-00533-9

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10589-023-00533-9
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10589-023-00533-9?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As access to this document is restricted, you may want to search for a different version of it.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ching-pei Lee & Stephen J. Wright, 2019. "Inexact Successive quadratic approximation for regularized optimization," Computational Optimization and Applications, Springer, vol. 72(3), pages 641-674, April.
    2. Huynh Ngai & Ta Anh Son, 2022. "Generalized Nesterov’s accelerated proximal gradient algorithms with convergence rate of order o(1/k²)," Computational Optimization and Applications, Springer, vol. 83(2), pages 615-649, November.
    3. Juan José Maulén & Juan Peypouquet, 2023. "A Speed Restart Scheme for a Dynamics with Hessian-Driven Damping," Journal of Optimization Theory and Applications, Springer, vol. 199(2), pages 831-855, November.
    4. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2018. "Exact Worst-Case Convergence Rates of the Proximal Gradient Method for Composite Convex Minimization," Journal of Optimization Theory and Applications, Springer, vol. 178(2), pages 455-476, August.
    5. Olivier Fercoq & Zheng Qu, 2020. "Restarting the accelerated coordinate descent method with a rough strong convexity estimate," Computational Optimization and Applications, Springer, vol. 75(1), pages 63-91, January.
    6. Xiaoya Zhang & Wei Peng & Hui Zhang, 2022. "Inertial proximal incremental aggregated gradient method with linear convergence guarantees," Mathematical Methods of Operations Research, Springer; Gesellschaft für Operations Research (GOR); Nederlands Genootschap voor Besliskunde (NGB), vol. 96(2), pages 187-213, October.
    7. Ren Jiang & Zhifeng Ji & Wuling Mo & Suhua Wang & Mingjun Zhang & Wei Yin & Zhen Wang & Yaping Lin & Xueke Wang & Umar Ashraf, 2022. "A Novel Method of Deep Learning for Shear Velocity Prediction in a Tight Sandstone Reservoir," Energies, MDPI, vol. 15(19), pages 1-20, September.
    8. Qingyang Liu & Yuping Zhang, 2023. "Integrative Structural Learning of Mixed Graphical Models via Pseudo-likelihood," Statistics in Biosciences, Springer; International Chinese Statistical Association, vol. 15(3), pages 562-582, December.
    9. Puya Latafat & Andreas Themelis & Silvia Villa & Panagiotis Patrinos, 2025. "On the Convergence of Proximal Gradient Methods for Convex Simple Bilevel Optimization," Journal of Optimization Theory and Applications, Springer, vol. 204(3), pages 1-36, March.
    10. Masaru Ito, 2016. "New results on subgradient methods for strongly convex optimization problems with a unified analysis," Computational Optimization and Applications, Springer, vol. 65(1), pages 127-172, September.
    11. Adrien B. Taylor & Julien M. Hendrickx & François Glineur, 2016. "Exact worst-case performance of first-order methods for composite convex optimization," LIDAM Discussion Papers CORE 2016052, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).
    12. Dimitris Bertsimas & Ryan Cory-Wright, 2022. "A Scalable Algorithm for Sparse Portfolio Selection," INFORMS Journal on Computing, INFORMS, vol. 34(3), pages 1489-1511, May.
    13. Weibin Mo & Yufeng Liu, 2022. "Efficient learning of optimal individualized treatment rules for heteroscedastic or misspecified treatment‐free effect models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(2), pages 440-472, April.
    14. Liu, Yulan & Bi, Shujun, 2019. "Error bounds for non-polyhedral convex optimization and applications to linear convergence of FDM and PGM," Applied Mathematics and Computation, Elsevier, vol. 358(C), pages 418-435.
    15. Sun, Shilin & Wang, Tianyang & Yang, Hongxing & Chu, Fulei, 2022. "Damage identification of wind turbine blades using an adaptive method for compressive beamforming based on the generalized minimax-concave penalty function," Renewable Energy, Elsevier, vol. 181(C), pages 59-70.
    16. Saif Eddin Jabari & Nikolaos M. Freris & Deepthi Mary Dilip, 2020. "Sparse Travel Time Estimation from Streaming Data," Transportation Science, INFORMS, vol. 54(1), pages 1-20, January.
    17. Le Thi Khanh Hien & Cuong V. Nguyen & Huan Xu & Canyi Lu & Jiashi Feng, 2019. "Accelerated Randomized Mirror Descent Algorithms for Composite Non-strongly Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 541-566, May.
    18. Ya-Feng Liu & Xin Liu & Shiqian Ma, 2019. "On the Nonergodic Convergence Rate of an Inexact Augmented Lagrangian Framework for Composite Convex Programming," Mathematics of Operations Research, INFORMS, vol. 44(2), pages 632-650, May.
    19. Reza Eghbali & Maryam Fazel, 2017. "Decomposable norm minimization with proximal-gradient homotopy algorithm," Computational Optimization and Applications, Springer, vol. 66(2), pages 345-381, March.
    20. Yu. Nesterov & Vladimir Shikhman, 2014. "Convergent subgradient methods for nonsmooth convex minimization," LIDAM Discussion Papers CORE 2014005, Université catholique de Louvain, Center for Operations Research and Econometrics (CORE).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:coopap:v:87:y:2024:i:2:d:10.1007_s10589-023-00533-9. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.