Printed from https://ideas.repec.org/r/inm/ormoor/v42y2017i2p330-348.html

A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications

Citations

Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.

Cited by:

  1. Vincenzo Bonifaci, 2021. "A Laplacian approach to $$\ell_1$$-norm minimization," Computational Optimization and Applications, Springer, vol. 79(2), pages 441-469, June.
  2. Shota Takahashi & Mituhiro Fukuda & Mirai Tanaka, 2022. "New Bregman proximal type algorithms for solving DC optimization problems," Computational Optimization and Applications, Springer, vol. 83(3), pages 893-931, December.
  3. Bonettini, S. & Prato, M. & Rebegoldi, S., 2021. "New convergence results for the inexact variable metric forward–backward method," Applied Mathematics and Computation, Elsevier, vol. 392(C).
  4. Yen-Huan Li & Volkan Cevher, 2019. "Convergence of the Exponentiated Gradient Method with Armijo Line Search," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 588-607, May.
  5. Regina S. Burachik & Yaohua Hu & Xiaoqi Yang, 2022. "Interior quasi-subgradient method with non-Euclidean distances for constrained quasi-convex optimization problems in Hilbert spaces," Journal of Global Optimization, Springer, vol. 83(2), pages 249-271, June.
  6. Yunier Bello-Cruz & Guoyin Li & Tran Thai An Nghia, 2022. "Quadratic Growth Conditions and Uniqueness of Optimal Solution to Lasso," Journal of Optimization Theory and Applications, Springer, vol. 194(1), pages 167-190, July.
  7. Bolte, Jérôme & Pauwels, Edouard, 2020. "Curiosities and counterexamples in smooth convex optimization," TSE Working Papers 20-1080, Toulouse School of Economics (TSE).
  8. Yunier Bello-Cruz & Guoyin Li & Tran T. A. Nghia, 2021. "On the Linear Convergence of Forward–Backward Splitting Method: Part I—Convergence Analysis," Journal of Optimization Theory and Applications, Springer, vol. 188(2), pages 378-401, February.
  9. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization," Computational Optimization and Applications, Springer, vol. 79(3), pages 681-715, July.
  10. Filip Hanzely & Peter Richtárik, 2021. "Fastest rates for stochastic mirror descent methods," Computational Optimization and Applications, Springer, vol. 79(3), pages 717-766, July.
  11. Zamani, Moslem & Abbaszadehpeivasti, Hadi & de Klerk, Etienne, 2023. "The exact worst-case convergence rate of the alternating direction method of multipliers," Other publications TiSEM f30ae9e6-ed19-423f-bd1e-0, Tilburg University, School of Economics and Management.
  12. Christian Kanzow & Patrick Mehlitz, 2022. "Convergence Properties of Monotone and Nonmonotone Proximal Gradient Methods Revisited," Journal of Optimization Theory and Applications, Springer, vol. 195(2), pages 624-646, November.
  13. Fan Wu & Wei Bian, 2023. "Smoothing Accelerated Proximal Gradient Method with Fast Convergence Rate for Nonsmooth Convex Optimization Beyond Differentiability," Journal of Optimization Theory and Applications, Springer, vol. 197(2), pages 539-572, May.
  14. S. Bonettini & M. Prato & S. Rebegoldi, 2018. "A block coordinate variable metric linesearch based proximal gradient method," Computational Optimization and Applications, Springer, vol. 71(1), pages 5-52, September.
  15. Peter Ochs & Jalal Fadili & Thomas Brox, 2019. "Non-smooth Non-convex Bregman Minimization: Unification and New Algorithms," Journal of Optimization Theory and Applications, Springer, vol. 181(1), pages 244-278, April.
  16. Radu-Alexandru Dragomir & Alexandre d’Aspremont & Jérôme Bolte, 2021. "Quartic First-Order Methods for Low-Rank Minimization," Journal of Optimization Theory and Applications, Springer, vol. 189(2), pages 341-363, May.
  17. Yin Liu & Sam Davanloo Tajbakhsh, 2023. "Stochastic Composition Optimization of Functions Without Lipschitz Continuous Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(1), pages 239-289, July.
  18. Gadat, Sébastien & Gavra, Ioana, 2021. "Asymptotic study of stochastic adaptive algorithm in non-convex landscape," TSE Working Papers 21-1175, Toulouse School of Economics (TSE).
  19. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
  20. Heinz H. Bauschke & Jérôme Bolte & Jiawei Chen & Marc Teboulle & Xianfu Wang, 2019. "On Linear Convergence of Non-Euclidean Gradient Methods without Strong Convexity and Lipschitz Gradient Continuity," Journal of Optimization Theory and Applications, Springer, vol. 182(3), pages 1068-1087, September.
  21. Emanuel Laude & Peter Ochs & Daniel Cremers, 2020. "Bregman Proximal Mappings and Bregman–Moreau Envelopes Under Relative Prox-Regularity," Journal of Optimization Theory and Applications, Springer, vol. 184(3), pages 724-761, March.
  22. HyungSeon Oh, 2021. "Distributed optimal power flow," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-27, June.
  23. Wei Peng & Hui Zhang & Xiaoya Zhang & Lizhi Cheng, 2020. "Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions," Journal of Global Optimization, Springer, vol. 78(1), pages 69-89, September.
  24. Sébastien Gadat & Ioana Gavra, 2022. "Asymptotic study of stochastic adaptive algorithm in non-convex landscape," Post-Print hal-03857182, HAL.
  25. Xin Jiang & Lieven Vandenberghe, 2023. "Bregman Three-Operator Splitting Methods," Journal of Optimization Theory and Applications, Springer, vol. 196(3), pages 936-972, March.
  26. Zehui Jia & Jieru Huang & Xingju Cai, 2021. "Proximal-like incremental aggregated gradient method with Bregman distance in weakly convex optimization problems," Journal of Global Optimization, Springer, vol. 80(4), pages 841-864, August.
  27. Xiantao Xiao, 2021. "A Unified Convergence Analysis of Stochastic Bregman Proximal Gradient and Extragradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 605-627, March.
  28. Filip Hanzely & Peter Richtárik & Lin Xiao, 2021. "Accelerated Bregman proximal gradient methods for relatively smooth convex optimization," Computational Optimization and Applications, Springer, vol. 79(2), pages 405-440, June.
  29. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
  30. Wang Chen & Xinmin Yang & Yong Zhao, 2023. "Conditional gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 85(3), pages 857-896, July.
  31. Yi Zhou & Yingbin Liang & Lixin Shen, 2019. "A simple convergence analysis of Bregman proximal gradient algorithm," Computational Optimization and Applications, Springer, vol. 73(3), pages 903-912, July.
  32. Daniel Reem & Simeon Reich & Alvaro De Pierro, 2019. "A Telescopic Bregmanian Proximal Gradient Method Without the Global Lipschitz Continuity Assumption," Journal of Optimization Theory and Applications, Springer, vol. 182(3), pages 851-884, September.
IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.