Primal-dual subgradient methods for convex problems
Abstract: In this paper we present a new approach to constructing subgradient schemes for different types of nonsmooth problems with convex structure. Our methods are primal-dual, since they always generate a feasible approximation to the optimum of an appropriately formulated dual problem. Among other advantages, this feature provides the methods with a reliable stopping criterion. The proposed schemes differ from the classical approaches (divergent-series methods, mirror-descent methods) by the presence of two control sequences: the first aggregates the support functions in the dual space, and the second establishes a dynamically updated scale between the primal and dual spaces. This additional flexibility allows us to guarantee boundedness of the sequence of primal test points even when the feasible set is unbounded. We present variants of the subgradient schemes for nonsmooth convex minimization, minimax problems, saddle-point problems, variational inequalities, and stochastic optimization. In all these settings our methods are proved to be optimal with respect to worst-case black-box lower complexity bounds.
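The two control sequences described above can be illustrated with a minimal sketch of the unconstrained Euclidean case: one sequence (here unit weights) aggregates the observed subgradients in the dual space, while a growing scale beta_k maps the aggregate back to a primal test point. This is only an illustrative sketch under those assumptions, not the paper's general scheme; the function names and the test objective are invented for the example.

```python
import math

def dual_averaging(subgrad, x0, steps, gamma=1.0):
    """Sketch of a primal-dual subgradient (dual-averaging) step for
    unconstrained minimization on the real line with Euclidean prox-function.

    z      -- aggregate of subgradients (the dual-space accumulator)
    beta_k -- dynamically updated primal-dual scale, here gamma * sqrt(k+1)
    """
    z = 0.0          # aggregated subgradients (first control sequence: unit weights)
    x = x0
    running_sum = 0.0
    for k in range(steps):
        g = subgrad(x)
        z += g                        # aggregate in the dual space
        beta = gamma * math.sqrt(k + 1)   # second control sequence: the scale
        x = x0 - z / beta             # primal test point from the prox-center x0
        running_sum += x
    return running_sum / steps        # averaged primal iterate

# Illustrative nonsmooth objective: f(x) = |x - 1|, minimized at x = 1.
f = lambda x: abs(x - 1.0)
sg = lambda x: 0.0 if x == 1.0 else (1.0 if x > 1.0 else -1.0)
x_hat = dual_averaging(sg, x0=0.0, steps=5000)
```

The averaged iterate `x_hat` approaches the minimizer at the O(1/sqrt(k)) rate typical of optimal black-box subgradient methods; the growing scale `beta` keeps the primal iterates bounded even though the feasible set here is all of R.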
Bibliographic info: Paper provided by Université catholique de Louvain, Center for Operations Research and Econometrics (CORE), in its series CORE Discussion Papers, number 2005067.
Date of creation: Oct 2005
Contact details of provider:
Postal: Voie du Roman Pays 34, 1348 Louvain-la-Neuve (Belgium)
Fax: +32 10474304
Web page: http://www.uclouvain.be/core
Keywords: convex optimization; subgradient methods; non-smooth optimization; minimax problems; saddle points; variational inequalities; stochastic optimization; black-box methods; lower complexity bounds