
Jérôme Bolte
(Jerome Bolte)

Personal Details

First Name: Jerome
Middle Name:
Last Name: Bolte
Suffix:
RePEc Short-ID: pbo498
[This author has chosen not to make the email address public]

Affiliation

Groupe de Recherche en Économie Mathématique et Quantitative (GREMAQ)
Toulouse School of Economics (TSE)

Toulouse, France
http://www-gremaq.univ-tlse1.fr/
RePEc:edi:getlsfr (more details at EDIRC)

Research output


Working papers

  1. Bolte, Jérôme & Combettes, Cyrille & Pauwels, Edouard, 2022. "The Iterates of the Frank-Wolfe Algorithm May Not Converge," TSE Working Papers 22-1311, Toulouse School of Economics (TSE).
  2. Bolte, Jérôme & Pauwels, Edouard & Silveti-Falls, Antonio & Le, Tam, 2022. "Nonsmooth Implicit Differentiation for Machine Learning and Optimization," TSE Working Papers 22-1314, Toulouse School of Economics (TSE).
  3. Villeneuve, Stéphane & Bolte, Jérôme & Miclo, Laurent, 2022. "Swarm gradient dynamics for global optimization: the mean-field limit case," TSE Working Papers 22-1302, Toulouse School of Economics (TSE).
  4. Le, Tam & Bolte, Jérôme & Pauwels, Edouard, 2022. "Subgradient sampling for nonsmooth nonconvex minimization," TSE Working Papers 22-1310, Toulouse School of Economics (TSE).
  5. Bolte, Jérôme & Pauwels, Edouard, 2021. "A mathematical model for automatic differentiation in machine learning," TSE Working Papers 21-1184, Toulouse School of Economics (TSE).
  6. Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).
  7. Bolte, Jérôme & Pauwels, Edouard, 2020. "Curiosities and counterexamples in smooth convex optimization," TSE Working Papers 20-1080, Toulouse School of Economics (TSE).
  8. Bolte, Jérôme & Pauwels, Edouard & Rios-Zertuche, Rodolfo, 2020. "Long term dynamics of the subgradient method for Lipschitz path differentiable functions," TSE Working Papers 20-1110, Toulouse School of Economics (TSE).
  9. Jérôme Bolte & Zheng Chen & Edouard Pauwels, 2020. "The multiproximal linearization method for convex composite problems," Post-Print hal-03170605, HAL.
  10. Bolte, Jérôme & Pauwels, Edouard, 2019. "Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning," TSE Working Papers 19-1044, Toulouse School of Economics (TSE).
  11. Bolte, Jérôme & Castera, Camille & Pauwels, Edouard & Févotte, Cédric, 2019. "An Inertial Newton Algorithm for Deep Learning," TSE Working Papers 19-1043, Toulouse School of Economics (TSE).
  12. Blanchet, Adrien & Bolte, Jérôme, 2017. "A family of functional inequalities: Łojasiewicz inequalities and displacement convex functions," IAST Working Papers 17-66, Institute for Advanced Study in Toulouse (IAST).
  13. Mariotti, Thomas & Bobtcheff, Catherine & Bolte, Jérôme, 2015. "Researcher's Dilemma," CEPR Discussion Papers 10858, C.E.P.R. Discussion Papers.

    repec:tse:wpaper:126768 is not listed on IDEAS

Articles

  1. Radu-Alexandru Dragomir & Alexandre d’Aspremont & Jérôme Bolte, 2021. "Quartic First-Order Methods for Low-Rank Minimization," Journal of Optimization Theory and Applications, Springer, vol. 189(2), pages 341-363, May.
  2. Heinz H. Bauschke & Jérôme Bolte & Jiawei Chen & Marc Teboulle & Xianfu Wang, 2019. "On Linear Convergence of Non-Euclidean Gradient Methods without Strong Convexity and Lipschitz Gradient Continuity," Journal of Optimization Theory and Applications, Springer, vol. 182(3), pages 1068-1087, September.
  3. Jérôme Bolte & Shoham Sabach & Marc Teboulle, 2018. "Nonconvex Lagrangian-Based Optimization: Monitoring Schemes and Global Convergence," Mathematics of Operations Research, INFORMS, vol. 43(4), pages 1210-1232, November.
  4. Heinz H. Bauschke & Jérôme Bolte & Marc Teboulle, 2017. "A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications," Mathematics of Operations Research, INFORMS, vol. 42(2), pages 330-348, May.
  5. Catherine Bobtcheff & Jérôme Bolte & Thomas Mariotti, 2017. "Researcher’s Dilemma," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 84(3), pages 969-1014.
  6. J. Bolte, 2003. "Continuous Gradient Projection Method in Hilbert Spaces," Journal of Optimization Theory and Applications, Springer, vol. 119(2), pages 235-259, November.
    RePEc:inm:ormoor:v:36:y:2011:i:1:p:55-70 is not listed on IDEAS
    RePEc:inm:ormoor:v:41:y:2016:i:2:p:442-465 is not listed on IDEAS
    RePEc:inm:ormoor:v:35:y:2010:i:2:p:438-457 is not listed on IDEAS
    RePEc:inm:ormoor:v:40:y:2015:i:1:p:171-191 is not listed on IDEAS

Citations

Many of the citations below have been collected in an experimental project, CitEc, where a more detailed citation analysis can be found. These are citations from works listed in RePEc that could be analyzed mechanically. So far, only a minority of all works could be analyzed. See the "Corrections" section below for how you can help improve the citation analysis.

Working papers

  1. Villeneuve, Stéphane & Bolte, Jérôme & Miclo, Laurent, 2022. "Swarm gradient dynamics for global optimization: the mean-field limit case," TSE Working Papers 22-1302, Toulouse School of Economics (TSE).

    Cited by:

    1. Miclo, Laurent, 2023. "On the convergence of global-optimization fraudulent stochastic algorithms," TSE Working Papers 23-1437, Toulouse School of Economics (TSE).

  2. Bolte, Jérôme & Pauwels, Edouard, 2021. "A mathematical model for automatic differentiation in machine learning," TSE Working Papers 21-1184, Toulouse School of Economics (TSE).

    Cited by:

    1. Edouard Pauwels, 2021. "Incremental Without Replacement Sampling in Nonconvex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 274-299, July.

  3. Bolte, Jérôme & Pauwels, Edouard, 2020. "Curiosities and counterexamples in smooth convex optimization," TSE Working Papers 20-1080, Toulouse School of Economics (TSE).

    Cited by:

    1. Jean-Pierre Crouzeix, 2022. "On Quasiconvex Functions Which are Convexifiable or Not," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 66-80, June.

  4. Bolte, Jérôme & Pauwels, Edouard & Rios-Zertuche, Rodolfo, 2020. "Long term dynamics of the subgradient method for Lipschitz path differentiable functions," TSE Working Papers 20-1110, Toulouse School of Economics (TSE).

    Cited by:

    1. Le, Tam & Bolte, Jérôme & Pauwels, Edouard, 2022. "Subgradient sampling for nonsmooth nonconvex minimization," TSE Working Papers 22-1310, Toulouse School of Economics (TSE).

  5. Bolte, Jérôme & Pauwels, Edouard, 2019. "Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning," TSE Working Papers 19-1044, Toulouse School of Economics (TSE).

    Cited by:

    1. Bolte, Jérôme & Pauwels, Edouard, 2021. "A mathematical model for automatic differentiation in machine learning," TSE Working Papers 21-1184, Toulouse School of Economics (TSE).
    2. Le, Tam & Bolte, Jérôme & Pauwels, Edouard, 2022. "Subgradient sampling for nonsmooth nonconvex minimization," TSE Working Papers 22-1310, Toulouse School of Economics (TSE).

  6. Bolte, Jérôme & Castera, Camille & Pauwels, Edouard & Févotte, Cédric, 2019. "An Inertial Newton Algorithm for Deep Learning," TSE Working Papers 19-1043, Toulouse School of Economics (TSE).

    Cited by:

    1. Bolte, Jérôme & Pauwels, Edouard, 2021. "A mathematical model for automatic differentiation in machine learning," TSE Working Papers 21-1184, Toulouse School of Economics (TSE).
    2. Bolte, Jérôme & Pauwels, Edouard & Silveti-Falls, Antonio & Le, Tam, 2022. "Nonsmooth Implicit Differentiation for Machine Learning and Optimization," TSE Working Papers 22-1314, Toulouse School of Economics (TSE).
    3. Samir Adly & Hedy Attouch & Van Nam Vo, 2023. "Convergence of Inertial Dynamics Driven by Sums of Potential and Nonpotential Operators with Implicit Newton-Like Damping," Journal of Optimization Theory and Applications, Springer, vol. 198(1), pages 290-331, July.
    4. Claire Boyer & Antoine Godichon-Baggioni, 2023. "On the asymptotic rate of convergence of Stochastic Newton algorithms and their Weighted Averaged versions," Computational Optimization and Applications, Springer, vol. 84(3), pages 921-972, April.
    5. Emilie Chouzenoux & Jean-Baptiste Fest, 2022. "SABRINA: A Stochastic Subspace Majorization-Minimization Algorithm," Journal of Optimization Theory and Applications, Springer, vol. 195(3), pages 919-952, December.
    6. Bolte, Jérôme & Pauwels, Edouard, 2019. "Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning," TSE Working Papers 19-1044, Toulouse School of Economics (TSE).
    7. Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).

  7. Mariotti, Thomas & Bobtcheff, Catherine & Bolte, Jérôme, 2015. "Researcher's Dilemma," CEPR Discussion Papers 10858, C.E.P.R. Discussion Papers.

    Cited by:

    1. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    2. Leonid Tiokhin & Minhua Yan & Thomas J. H. Morgan, 2021. "Competition for priority harms the reliability of science, but reforms can help," Nature Human Behaviour, Nature, vol. 5(7), pages 857-867, July.
    3. Chia-Hui Chen & Junichiro Ishida & Arijit Mukherjee, 2021. "Pioneer, Early Follower or Late Entrant: Entry Dynamics with Learning and Market Competition," ISER Discussion Paper 1132, Institute of Social and Economic Research, Osaka University.
    4. Bobtcheff, Catherine & Levy, Raphaël & Mariotti, Thomas, 2021. "Negative results in science: Blessing or (winner's) curse?," CEPR Discussion Papers 16024, C.E.P.R. Discussion Papers.
    5. Hoppe-Wewetzer, Heidrun & Katsenos, Georgios & Ozdenoren, Emre, 2023. "The effects of rivalry on scientific progress under public vs private learning," Journal of Economic Theory, Elsevier, vol. 212(C).
    6. Dirk Bergemann & Marco Ottaviani, 2021. "Information Markets and Nonmarkets," Cowles Foundation Discussion Papers 2296, Cowles Foundation for Research in Economics, Yale University.
    7. Sadler, Evan, 2021. "Dead ends," Journal of Economic Theory, Elsevier, vol. 191(C).
    8. Baruffaldi, Stefano & Poege, Felix, 2020. "A Firm Scientific Community: Industry Participation and Knowledge Diffusion," IZA Discussion Papers 13419, Institute of Labor Economics (IZA).
    9. Mariotti, Thomas & Décamps, Jean-Paul & Gensbittel, Fabien, 2021. "Investment Timing and Technological Breakthrough," CEPR Discussion Papers 16246, C.E.P.R. Discussion Papers.
    10. Song, Yangbo & Zhao, Mofei, 2021. "Dynamic R&D competition under uncertainty and strategic disclosure," Journal of Economic Behavior & Organization, Elsevier, vol. 181(C), pages 169-210.
    11. Damien Besancenot & Radu Vranceanu, 2014. "Fear of novelty: a model of scientific discovery with strategic uncertainty," Working Papers hal-01117929, HAL.
    12. Mohan, Vijay, 2019. "On the use of blockchain-based mechanisms to tackle academic misconduct," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    13. Groen-Xu, Moqi & Bös, Gregor & Teixeira, Pedro A. & Voigt, Thomas & Knapp, Bernhard, 2023. "Short-term incentives of research evaluations: Evidence from the UK Research Excellence Framework," Research Policy, Elsevier, vol. 52(6).

Articles

  1. Heinz H. Bauschke & Jérôme Bolte & Jiawei Chen & Marc Teboulle & Xianfu Wang, 2019. "On Linear Convergence of Non-Euclidean Gradient Methods without Strong Convexity and Lipschitz Gradient Continuity," Journal of Optimization Theory and Applications, Springer, vol. 182(3), pages 1068-1087, September.

    Cited by:

    1. Xue Gao & Xingju Cai & Xiangfeng Wang & Deren Han, 2023. "An alternating structure-adapted Bregman proximal gradient descent algorithm for constrained nonconvex nonsmooth optimization problems and its inertial variant," Journal of Global Optimization, Springer, vol. 87(1), pages 277-300, September.
    2. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization," Computational Optimization and Applications, Springer, vol. 79(3), pages 681-715, July.
    3. Yin Liu & Sam Davanloo Tajbakhsh, 2023. "Stochastic Composition Optimization of Functions Without Lipschitz Continuous Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(1), pages 239-289, July.
    4. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    5. Zhongming Wu & Chongshou Li & Min Li & Andrew Lim, 2021. "Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems," Journal of Global Optimization, Springer, vol. 79(3), pages 617-644, March.
    6. Jing Zhao & Qiao-Li Dong & Michael Th. Rassias & Fenghui Wang, 2022. "Two-step inertial Bregman alternating minimization algorithm for nonconvex and nonsmooth problems," Journal of Global Optimization, Springer, vol. 84(4), pages 941-966, December.
    7. Emanuel Laude & Peter Ochs & Daniel Cremers, 2020. "Bregman Proximal Mappings and Bregman–Moreau Envelopes Under Relative Prox-Regularity," Journal of Optimization Theory and Applications, Springer, vol. 184(3), pages 724-761, March.

  2. Jérôme Bolte & Shoham Sabach & Marc Teboulle, 2018. "Nonconvex Lagrangian-Based Optimization: Monitoring Schemes and Global Convergence," Mathematics of Operations Research, INFORMS, vol. 43(4), pages 1210-1232, November.

    Cited by:

    1. Eyal Cohen & Nadav Hallak & Marc Teboulle, 2022. "A Dynamic Alternating Direction of Multipliers for Nonconvex Minimization with Nonlinear Functional Equality Constraints," Journal of Optimization Theory and Applications, Springer, vol. 193(1), pages 324-353, June.
    2. Radu Ioan Bot & Dang-Khoa Nguyen, 2020. "The Proximal Alternating Direction Method of Multipliers in the Nonconvex Setting: Convergence Analysis and Rates," Mathematics of Operations Research, INFORMS, vol. 45(2), pages 682-712, May.
    3. Bolte, Jérôme & Glaudin, Lilian & Pauwels, Edouard & Serrurier, Matthieu, 2021. "A Hölderian backtracking method for min-max and min-min problems," TSE Working Papers 21-1243, Toulouse School of Economics (TSE).

  3. Heinz H. Bauschke & Jérôme Bolte & Marc Teboulle, 2017. "A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications," Mathematics of Operations Research, INFORMS, vol. 42(2), pages 330-348, May.

    Cited by:

    1. Vincenzo Bonifaci, 2021. "A Laplacian approach to ℓ1-norm minimization," Computational Optimization and Applications, Springer, vol. 79(2), pages 441-469, June.
    2. Shota Takahashi & Mituhiro Fukuda & Mirai Tanaka, 2022. "New Bregman proximal type algorithms for solving DC optimization problems," Computational Optimization and Applications, Springer, vol. 83(3), pages 893-931, December.
    3. Yen-Huan Li & Volkan Cevher, 2019. "Convergence of the Exponentiated Gradient Method with Armijo Line Search," Journal of Optimization Theory and Applications, Springer, vol. 181(2), pages 588-607, May.
    4. Regina S. Burachik & Yaohua Hu & Xiaoqi Yang, 2022. "Interior quasi-subgradient method with non-Euclidean distances for constrained quasi-convex optimization problems in Hilbert spaces," Journal of Global Optimization, Springer, vol. 83(2), pages 249-271, June.
    5. Yunier Bello-Cruz & Guoyin Li & Tran Thai An Nghia, 2022. "Quadratic Growth Conditions and Uniqueness of Optimal Solution to Lasso," Journal of Optimization Theory and Applications, Springer, vol. 194(1), pages 167-190, July.
    6. Yunier Bello-Cruz & Guoyin Li & Tran T. A. Nghia, 2021. "On the Linear Convergence of Forward–Backward Splitting Method: Part I—Convergence Analysis," Journal of Optimization Theory and Applications, Springer, vol. 188(2), pages 378-401, February.
    7. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization," Computational Optimization and Applications, Springer, vol. 79(3), pages 681-715, July.
    8. Zamani, Moslem & Abbaszadehpeivasti, Hadi & de Klerk, Etienne, 2023. "The exact worst-case convergence rate of the alternating direction method of multipliers," Other publications TiSEM f30ae9e6-ed19-423f-bd1e-0, Tilburg University, School of Economics and Management.
    9. Christian Kanzow & Patrick Mehlitz, 2022. "Convergence Properties of Monotone and Nonmonotone Proximal Gradient Methods Revisited," Journal of Optimization Theory and Applications, Springer, vol. 195(2), pages 624-646, November.
    10. Yin Liu & Sam Davanloo Tajbakhsh, 2023. "Stochastic Composition Optimization of Functions Without Lipschitz Continuous Gradient," Journal of Optimization Theory and Applications, Springer, vol. 198(1), pages 239-289, July.
    11. Heinz H. Bauschke & Jérôme Bolte & Jiawei Chen & Marc Teboulle & Xianfu Wang, 2019. "On Linear Convergence of Non-Euclidean Gradient Methods without Strong Convexity and Lipschitz Gradient Continuity," Journal of Optimization Theory and Applications, Springer, vol. 182(3), pages 1068-1087, September.
    12. HyungSeon Oh, 2021. "Distributed optimal power flow," PLOS ONE, Public Library of Science, vol. 16(6), pages 1-27, June.
    13. Wei Peng & Hui Zhang & Xiaoya Zhang & Lizhi Cheng, 2020. "Global complexity analysis of inexact successive quadratic approximation methods for regularized optimization under mild assumptions," Journal of Global Optimization, Springer, vol. 78(1), pages 69-89, September.
    14. Zehui Jia & Jieru Huang & Xingju Cai, 2021. "Proximal-like incremental aggregated gradient method with Bregman distance in weakly convex optimization problems," Journal of Global Optimization, Springer, vol. 80(4), pages 841-864, August.
    15. Xiantao Xiao, 2021. "A Unified Convergence Analysis of Stochastic Bregman Proximal Gradient and Extragradient Methods," Journal of Optimization Theory and Applications, Springer, vol. 188(3), pages 605-627, March.
    16. Masoud Ahookhosh, 2019. "Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity," Mathematical Methods of Operations Research, Springer;Gesellschaft für Operations Research (GOR);Nederlands Genootschap voor Besliskunde (NGB), vol. 89(3), pages 319-353, June.
    17. Daniel Reem & Simeon Reich & Alvaro Pierro, 2019. "A Telescopic Bregmanian Proximal Gradient Method Without the Global Lipschitz Continuity Assumption," Journal of Optimization Theory and Applications, Springer, vol. 182(3), pages 851-884, September.
    18. Bonettini, S. & Prato, M. & Rebegoldi, S., 2021. "New convergence results for the inexact variable metric forward–backward method," Applied Mathematics and Computation, Elsevier, vol. 392(C).
    19. Bolte, Jérôme & Pauwels, Edouard, 2020. "Curiosities and counterexamples in smooth convex optimization," TSE Working Papers 20-1080, Toulouse School of Economics (TSE).
    20. Filip Hanzely & Peter Richtárik, 2021. "Fastest rates for stochastic mirror descent methods," Computational Optimization and Applications, Springer, vol. 79(3), pages 717-766, July.
    21. Fan Wu & Wei Bian, 2023. "Smoothing Accelerated Proximal Gradient Method with Fast Convergence Rate for Nonsmooth Convex Optimization Beyond Differentiability," Journal of Optimization Theory and Applications, Springer, vol. 197(2), pages 539-572, May.
    22. S. Bonettini & M. Prato & S. Rebegoldi, 2018. "A block coordinate variable metric linesearch based proximal gradient method," Computational Optimization and Applications, Springer, vol. 71(1), pages 5-52, September.
    23. Peter Ochs & Jalal Fadili & Thomas Brox, 2019. "Non-smooth Non-convex Bregman Minimization: Unification and New Algorithms," Journal of Optimization Theory and Applications, Springer, vol. 181(1), pages 244-278, April.
    24. Radu-Alexandru Dragomir & Alexandre d’Aspremont & Jérôme Bolte, 2021. "Quartic First-Order Methods for Low-Rank Minimization," Journal of Optimization Theory and Applications, Springer, vol. 189(2), pages 341-363, May.
    25. Gadat, Sébastien & Gavra, Ioana, 2021. "Asymptotic study of stochastic adaptive algorithm in non-convex landscape," TSE Working Papers 21-1175, Toulouse School of Economics (TSE).
    26. Masoud Ahookhosh & Le Thi Khanh Hien & Nicolas Gillis & Panagiotis Patrinos, 2021. "A Block Inertial Bregman Proximal Algorithm for Nonsmooth Nonconvex Problems with Application to Symmetric Nonnegative Matrix Tri-Factorization," Journal of Optimization Theory and Applications, Springer, vol. 190(1), pages 234-258, July.
    27. Emanuel Laude & Peter Ochs & Daniel Cremers, 2020. "Bregman Proximal Mappings and Bregman–Moreau Envelopes Under Relative Prox-Regularity," Journal of Optimization Theory and Applications, Springer, vol. 184(3), pages 724-761, March.
    28. Sébastien Gadat & Ioana Gavra, 2022. "Asymptotic study of stochastic adaptive algorithm in non-convex landscape," Post-Print hal-03857182, HAL.
    29. Xin Jiang & Lieven Vandenberghe, 2023. "Bregman Three-Operator Splitting Methods," Journal of Optimization Theory and Applications, Springer, vol. 196(3), pages 936-972, March.
    30. Filip Hanzely & Peter Richtárik & Lin Xiao, 2021. "Accelerated Bregman proximal gradient methods for relatively smooth convex optimization," Computational Optimization and Applications, Springer, vol. 79(2), pages 405-440, June.
    31. Wang Chen & Xinmin Yang & Yong Zhao, 2023. "Conditional gradient method for vector optimization," Computational Optimization and Applications, Springer, vol. 85(3), pages 857-896, July.
    32. Yi Zhou & Yingbin Liang & Lixin Shen, 2019. "A simple convergence analysis of Bregman proximal gradient algorithm," Computational Optimization and Applications, Springer, vol. 73(3), pages 903-912, July.

  4. Catherine Bobtcheff & Jérôme Bolte & Thomas Mariotti, 2017. "Researcher’s Dilemma," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 84(3), pages 969-1014.
    See citations under working paper version above.
  5. J. Bolte, 2003. "Continuous Gradient Projection Method in Hilbert Spaces," Journal of Optimization Theory and Applications, Springer, vol. 119(2), pages 235-259, November.

    Cited by:

    1. B. Abbas & H. Attouch & Benar F. Svaiter, 2014. "Newton-Like Dynamics and Forward-Backward Methods for Structured Monotone Inclusions in Hilbert Spaces," Journal of Optimization Theory and Applications, Springer, vol. 161(2), pages 331-360, May.
    2. P. Nistri & M. Quincampoix, 2005. "On the Dynamics of a Differential Inclusion Built upon a Nonconvex Constrained Minimization Problem," Journal of Optimization Theory and Applications, Springer, vol. 124(3), pages 659-672, March.
    3. Haixin Ren & Bin Ge & Xiangwu Zhuge, 2023. "Fast Convergence of Inertial Gradient Dynamics with Multiscale Aspects," Journal of Optimization Theory and Applications, Springer, vol. 196(2), pages 461-489, February.
    4. Boţ, Radu Ioan & Kanzler, Laura, 2021. "A forward-backward dynamical approach for nonsmooth problems with block structure coupled by a smooth function," Applied Mathematics and Computation, Elsevier, vol. 394(C).
    5. Sylvain Sorin, 2023. "Continuous Time Learning Algorithms in Optimization and Game Theory," Dynamic Games and Applications, Springer, vol. 13(1), pages 3-24, March.

More information

Research fields, statistics, top rankings, if available.

Statistics

Access and download statistics for all items

Co-authorship network on CollEc

NEP Fields

NEP is an announcement service for new working papers, with a weekly report in each of many fields. This author has had 11 papers announced in NEP. These are the fields, ordered by number of announcements, along with their dates. If the author is listed in the directory of specialists for this field, a link is also provided.
  1. NEP-BIG: Big Data (5) 2019-10-21 2019-10-21 2021-03-15 2022-04-18 2022-04-18. Author is listed in the directory of specialists for this field.
  2. NEP-CMP: Computational Economics (4) 2019-10-21 2019-10-21 2021-03-15 2022-04-18
  3. NEP-GTH: Game Theory (3) 2013-03-02 2013-03-09 2015-10-10
  4. NEP-HPE: History and Philosophy of Economics (3) 2013-03-02 2013-03-09 2015-10-10
  5. NEP-SOG: Sociology of Economics (3) 2013-03-02 2013-03-09 2015-10-10
  6. NEP-CTA: Contract Theory and Applications (2) 2013-03-02 2013-03-09
  7. NEP-HRM: Human Capital and Human Resource Management (2) 2013-03-02 2013-03-09
  8. NEP-ORE: Operations Research (2) 2019-10-21 2022-02-28
  9. NEP-BAN: Banking (1) 2022-04-11
  10. NEP-CWA: Central and Western Asia (1) 2013-03-09
  11. NEP-INO: Innovation (1) 2015-10-10
  12. NEP-ISF: Islamic Finance (1) 2021-09-06
  13. NEP-MIC: Microeconomics (1) 2015-10-10

Corrections

All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. For general information on how to correct material on RePEc, see these instructions.

To update listings or check citations waiting for approval, Jérôme Bolte should log into the RePEc Author Service.

To make corrections to the bibliographic information of a particular item, find the technical contact on the abstract page of that item. There, details are also given on how to add or correct references and citations.

To link different versions of the same work, where versions have a different title, use this form. Note that if the versions have a very similar title and are in the author's profile, the links will usually be created automatically.

Please note that most corrections can take a couple of weeks to filter through the various RePEc services.

IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.