
Variance-Reduced Accelerated First-Order Methods: Central Limit Theorems and Confidence Statements

Author

Listed:
  • Jinlong Lei

    (Department of Control Science and Engineering, Tongji University, Shanghai 201804, China; and Shanghai Research Institute for Intelligent Autonomous Systems, Tongji University, Shanghai 201804, China)

  • Uday V. Shanbhag

    (Department of Industrial and Manufacturing Engineering, Pennsylvania State University, University Park, Pennsylvania 16802)

Abstract

In this paper, we consider a strongly convex stochastic optimization problem and propose three classes of variable sample-size stochastic first-order methods: (i) the standard stochastic gradient descent method, (ii) its accelerated variant, and (iii) the stochastic heavy-ball method. In each scheme, the exact gradient is approximated by averaging over an increasing batch of sampled gradients. We prove that when the sample size increases at a geometric rate, the estimates generated by schemes (i)–(iii) converge in mean to the optimal solution at an analogous geometric rate. Building on this result, we establish central limit theorems (CLTs), showing that the rescaled estimation errors converge in distribution to a normal distribution whose covariance matrix depends on the Hessian matrix, the covariance of the gradient noise, and the step length. If the sample size instead increases at a polynomial rate, we show that the estimation errors decay at a corresponding polynomial rate and establish the associated CLTs. Under certain conditions, we discuss how both the algorithms and the associated limit theorems may be extended to constrained and nonsmooth regimes. Finally, we provide an avenue to construct confidence regions for the optimal solution based on the established CLTs and test the theoretical findings on a stochastic parameter estimation problem.
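The variable sample-size idea behind scheme (i) can be sketched numerically. The snippet below is a hedged illustration, not the authors' implementation: it runs plain stochastic gradient descent on a synthetic strongly convex quadratic, with the batch size grown geometrically. The growth rate `rho`, the step length, the problem data, and the Gaussian gradient-noise model are all assumptions made for illustration.

```python
import numpy as np

# Hedged sketch (not the paper's code): variable sample-size SGD on a
# strongly convex quadratic f(x) = 0.5 x'Qx - b'x, where each sampled
# gradient is the exact gradient plus zero-mean Gaussian noise.
rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))
Q = A @ A.T + d * np.eye(d)          # Hessian; strongly convex by construction
b = rng.standard_normal(d)
x_star = np.linalg.solve(Q, b)       # unique minimizer, for measuring the error

alpha = 1.0 / np.linalg.norm(Q, 2)   # step length 1/L (assumed choice)
rho = 1.3                            # geometric batch-growth rate (assumed)
x = np.zeros(d)
errors = []
for k in range(40):
    N_k = int(np.ceil(rho ** k))                 # batch size grows geometrically
    noise = rng.standard_normal((N_k, d))
    g = Q @ x - b + noise.mean(axis=0)           # batch-averaged stochastic gradient
    x = x - alpha * g
    errors.append(np.linalg.norm(x - x_star))

# With a geometrically increasing batch size, the averaged gradient noise
# shrinks fast enough that the error contracts at a geometric rate overall.
print(errors[0], errors[-1])
```

The same loop, with the update replaced by a Nesterov or heavy-ball step, gives the batch-size schedule for schemes (ii) and (iii); the paper's CLTs then describe the limiting distribution of the suitably rescaled error `x - x_star`.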

Suggested Citation

  • Jinlong Lei & Uday V. Shanbhag, 2025. "Variance-Reduced Accelerated First-Order Methods: Central Limit Theorems and Confidence Statements," Mathematics of Operations Research, INFORMS, vol. 50(2), pages 1364-1397, May.
  • Handle: RePEc:inm:ormoor:v:50:y:2025:i:2:p:1364-1397
    DOI: 10.1287/moor.2021.0068
    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/moor.2021.0068
    Download Restriction: no
