
Random gradient-free minimization of convex functions

Citations

Citations are extracted by the CitEc Project.

Cited by:

  1. Ryota Nozawa & Pierre-Louis Poirion & Akiko Takeda, 2025. "Zeroth-Order Random Subspace Algorithm for Non-smooth Convex Optimization," Journal of Optimization Theory and Applications, Springer, vol. 204(3), pages 1-31, March.
  2. Aleksandr Lobanov & Andrew Veprikov & Georgiy Konin & Aleksandr Beznosikov & Alexander Gasnikov & Dmitry Kovalev, 2023. "Non-smooth setting of stochastic decentralized convex optimization problem over time-varying Graphs," Computational Management Science, Springer, vol. 20(1), pages 1-55, December.
  3. Rajeeva Laxman Karandikar & Mathukumalli Vidyasagar, 2024. "Convergence Rates for Stochastic Approximation: Biased Noise with Unbounded Variance, and Applications," Journal of Optimization Theory and Applications, Springer, vol. 203(3), pages 2412-2450, December.
  4. Vyacheslav Kungurtsev & Francesco Rinaldi & Damiano Zeffiro, 2024. "Retraction-Based Direct Search Methods for Derivative Free Riemannian Optimization," Journal of Optimization Theory and Applications, Springer, vol. 203(2), pages 1710-1735, November.
  5. Stefan Wager & Kuang Xu, 2021. "Experimenting in Equilibrium," Management Science, INFORMS, vol. 67(11), pages 6694-6715, November.
  6. Marco Boresta & Tommaso Colombo & Alberto Santis & Stefano Lucidi, 2022. "A Mixed Finite Differences Scheme for Gradient Approximation," Journal of Optimization Theory and Applications, Springer, vol. 194(1), pages 1-24, July.
  7. Marco Rando & Cesare Molinari & Silvia Villa & Lorenzo Rosasco, 2024. "Stochastic zeroth order descent with structured directions," Computational Optimization and Applications, Springer, vol. 89(3), pages 691-727, December.
  8. Ghadimi, Saeed & Powell, Warren B., 2024. "Stochastic search for a parametric cost function approximation: Energy storage with rolling forecasts," European Journal of Operational Research, Elsevier, vol. 312(2), pages 641-652.
  9. Flavia Chorobura & Ion Necoara, 2024. "Coordinate descent methods beyond smoothness and separability," Computational Optimization and Applications, Springer, vol. 88(1), pages 107-149, May.
  10. Katya Scheinberg, 2022. "Finite Difference Gradient Approximation: To Randomize or Not?," INFORMS Journal on Computing, INFORMS, vol. 34(5), pages 2384-2388, September.
  11. Tianyu Wang & Yasong Feng, 2024. "Convergence Rates of Zeroth Order Gradient Descent for Łojasiewicz Functions," INFORMS Journal on Computing, INFORMS, vol. 36(6), pages 1611-1633, December.
  12. Ghaderi, Susan & Ahookhosh, Masoud & Arany, Adam & Skupin, Alexander & Patrinos, Panagiotis & Moreau, Yves, 2024. "Smoothing unadjusted Langevin algorithms for nonsmooth composite potential functions," Applied Mathematics and Computation, Elsevier, vol. 464(C).
  13. Dvurechensky, Pavel & Gorbunov, Eduard & Gasnikov, Alexander, 2021. "An accelerated directional derivative method for smooth stochastic convex optimization," European Journal of Operational Research, Elsevier, vol. 290(2), pages 601-621.
  14. Jun Xie & Qingyun Yu & Chi Cao, 2018. "A Distributed Randomized Gradient-Free Algorithm for the Non-Convex Economic Dispatch Problem," Energies, MDPI, vol. 11(1), pages 1-15, January.
  15. Nikita Kornilov & Alexander Gasnikov & Pavel Dvurechensky & Darina Dvinskikh, 2023. "Gradient-free methods for non-smooth convex stochastic optimization with heavy-tailed noise on convex compact," Computational Management Science, Springer, vol. 20(1), pages 1-43, December.
  16. Jean-Jacques Forneron, 2023. "Noisy, Non-Smooth, Non-Convex Estimation of Moment Condition Models," Papers 2301.07196, arXiv.org, revised Feb 2023.
  17. Geovani Nunes Grapiglia, 2023. "Quadratic regularization methods with finite-difference gradient approximations," Computational Optimization and Applications, Springer, vol. 85(3), pages 683-703, July.
  18. Michael R. Metel & Akiko Takeda, 2022. "Perturbed Iterate SGD for Lipschitz Continuous Loss Functions," Journal of Optimization Theory and Applications, Springer, vol. 195(2), pages 504-547, November.
  19. Jingxu Xu & Zeyu Zheng, 2023. "Gradient-Based Simulation Optimization Algorithms via Multi-Resolution System Approximations," INFORMS Journal on Computing, INFORMS, vol. 35(3), pages 633-651, May.
  20. Youssef Diouane & Vyacheslav Kungurtsev & Francesco Rinaldi & Damiano Zeffiro, 2024. "Inexact direct-search methods for bilevel optimization problems," Computational Optimization and Applications, Springer, vol. 88(2), pages 469-490, June.
  21. Yijie Peng & Li Xiao & Bernd Heidergott & L. Jeff Hong & Henry Lam, 2022. "A New Likelihood Ratio Method for Training Artificial Neural Networks," INFORMS Journal on Computing, INFORMS, vol. 34(1), pages 638-655, January.
  22. Jun Xie & Chi Cao, 2017. "Non-Convex Economic Dispatch of a Virtual Power Plant via a Distributed Randomized Gradient-Free Algorithm," Energies, MDPI, vol. 10(7), pages 1-12, July.
  23. Veprikov, Andrey & Bogdanov, Alexander & Minashkin, Vladislav & Beznosikov, Aleksandr, 2024. "New aspects of black box conditional gradient: Variance reduction and one point feedback," Chaos, Solitons & Fractals, Elsevier, vol. 189(P1).
  24. David Kozak & Stephen Becker & Alireza Doostan & Luis Tenorio, 2021. "A stochastic subspace approach to gradient-free optimization in high dimensions," Computational Optimization and Applications, Springer, vol. 79(2), pages 339-368, June.
  25. V. Kungurtsev & F. Rinaldi, 2021. "A zeroth order method for stochastic weakly convex optimization," Computational Optimization and Applications, Springer, vol. 80(3), pages 731-753, December.
  26. Alireza Aghasi & Saeed Ghadimi, 2025. "Fully Zeroth-Order Bilevel Programming via Gaussian Smoothing," Journal of Optimization Theory and Applications, Springer, vol. 205(2), pages 1-39, May.
  27. Zhongruo Wang & Krishnakumar Balasubramanian & Shiqian Ma & Meisam Razaviyayn, 2023. "Zeroth-order algorithms for nonconvex–strongly-concave minimax problems with improved complexities," Journal of Global Optimization, Springer, vol. 87(2), pages 709-740, November.
  28. Hoang Tran & Qiang Du & Guannan Zhang, 2025. "Convergence analysis for a nonlocal gradient descent method via directional Gaussian smoothing," Computational Optimization and Applications, Springer, vol. 90(2), pages 481-513, March.