Author
Listed:
- Haixiang Zhang
(Department of Mathematics, University of California, Berkeley, California 94720)
- Zeyu Zheng
(Department of Industrial Engineering and Operations Research, University of California, Berkeley, California 94720)
- Javad Lavaei
(Department of Industrial Engineering and Operations Research, University of California, Berkeley, California 94720)
Abstract
We develop and analyze a set of new sequential simulation-optimization algorithms for large-scale multidimensional discrete optimization via simulation problems with a convexity structure. The “large-scale” notion refers to the discrete decision variable having a large number of values to choose from on each dimension. The proposed algorithms are designed to identify a solution that is close to the optimal solution at any given precision level and with any given probability. To achieve this target, our algorithm design exploits the convexity structure and does not need to scan all the choices of the decision variable; instead, it sequentially draws a subset of choices and uses them to “localize” potentially near-optimal solutions to an adaptively shrinking region. To show the power of the proposed methods based on the localization idea, we first consider one-dimensional large-scale problems. We develop the shrinking uniform sampling algorithm, which is proved to achieve the target with an optimal expected simulation cost under an asymptotic criterion. For multidimensional problems, we combine the idea of localization with subgradient information and propose a framework for designing stochastic cutting-plane methods, whose expected simulation costs have a low dependence on the scale and the dimension of the problems. In addition, utilizing the discrete nature of the problems, we propose a stochastic dimension-reduction algorithm, which does not require prior information about the Lipschitz constant of the objective function and whose simulation costs are upper bounded by a value independent of the Lipschitz constant. We implement the proposed algorithms on synthetic problems and queueing simulation-optimization problems and demonstrate better performance than benchmark methods, especially for large-scale examples.
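The abstract describes the localization idea only in words. The sketch below is a minimal, hedged illustration of what a one-dimensional localization loop of this kind might look like; it is not the paper's shrinking uniform sampling algorithm. The simulator noisy_objective, the shrinking rule, and all parameter values are hypothetical assumptions introduced purely for illustration.

```python
# Illustrative sketch only: a one-dimensional localization loop for convex
# discrete optimization via simulation. All names and parameters are assumed.
import random

def noisy_objective(x, noise=1.0):
    # Hypothetical simulator: a convex true objective plus simulation noise.
    return (x - 3000) ** 2 * 1e-4 + random.gauss(0.0, noise)

def estimate(x, replications=50):
    # Average independent simulation replications at decision value x.
    return sum(noisy_objective(x) for _ in range(replications)) / replications

def localize(lo, hi, grid_size=10, replications=50):
    """Shrink the candidate interval [lo, hi] around the empirically best
    grid points until only a few candidates remain."""
    while hi - lo > grid_size:
        step = max(1, (hi - lo) // grid_size)
        grid = list(range(lo, hi + 1, step))
        estimates = [estimate(x, replications) for x in grid]
        best = min(range(len(grid)), key=lambda i: estimates[i])
        # Convexity suggests the optimum lies near the best grid point,
        # so keep only the subinterval around it.
        lo = grid[max(best - 1, 0)]
        hi = grid[min(best + 1, len(grid) - 1)]
    # Exhaustively estimate the remaining candidates.
    return min(range(lo, hi + 1), key=lambda x: estimate(x, replications))

if __name__ == "__main__":
    print(localize(0, 100_000))
```

The sketch conveys why the localization idea avoids scanning all choices: each round simulates only a small grid of candidates and, by convexity, discards most of the remaining interval.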
Suggested Citation
Haixiang Zhang & Zeyu Zheng & Javad Lavaei, 2025.
"Stochastic Localization Methods for Convex Discrete Optimization via Simulation,"
Operations Research, INFORMS, vol. 73(2), pages 927-948, March.
Handle:
RePEc:inm:oropre:v:73:y:2025:i:2:p:927-948
DOI: 10.1287/opre.2022.0030