Printed from https://ideas.repec.org/a/inm/oropre/v73y2025i3p1581-1597.html

Slowly Varying Regression Under Sparsity

Authors

Listed:
  • Dimitris Bertsimas

    (Sloan School of Management, Massachusetts Institute of Technology, Cambridge, Massachusetts 02142)

  • Vassilis Digalakis

    (Department of Information Systems and Operations Management, HEC Paris, 78350 Jouy-en-Josas, France)

  • Michael Lingzhi Li

    (Technology and Operations Management Unit, Harvard Business School, Boston, Massachusetts 02163)

  • Omar Skali Lami

    (McKinsey & Company, Boston, Massachusetts 02210)

Abstract

We introduce the framework of slowly varying regression under sparsity, which allows sparse regression models to vary slowly and sparsely. We formulate the problem of parameter estimation as a mixed-integer optimization problem and demonstrate that it can be reformulated exactly as a binary convex optimization problem through a novel relaxation. The relaxation utilizes a new identity involving Moore–Penrose inverses that convexifies the nonconvex objective function while coinciding with the original objective on all feasible binary points. This allows us to solve the problem significantly more efficiently and to provable optimality using a cutting plane–type algorithm. We develop a highly optimized implementation of this algorithm, which substantially improves upon the asymptotic computational complexity of a straightforward implementation. We further develop a fast heuristic method that is guaranteed to produce a feasible solution and, as we empirically illustrate, generates high-quality warm-start solutions for the binary optimization problem. To tune the framework’s hyperparameters, we propose a practical procedure relying on binary search that, under certain assumptions, is guaranteed to recover the true model parameters. We show, on both synthetic and real-world data sets, that the resulting algorithm outperforms competing formulations in comparable times across a variety of metrics, including estimation accuracy, predictive power, and computational time, and is highly scalable, enabling us to train models with tens of thousands of parameters.
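To convey the "slowly and sparsely varying" structure the abstract describes, the sketch below fits per-period coefficient vectors under a lasso penalty (sparsity within each period) plus a fused-lasso-style penalty (sparse changes between consecutive periods). This is only an illustrative convex surrogate, not the paper's exact mixed-integer formulation, Moore–Penrose relaxation, or cutting-plane algorithm; all variable names and hyperparameter values are hypothetical.

```python
import numpy as np

# Illustrative sketch only (not the authors' method): minimize
#   sum_t ||y_t - X_t b_t||^2 + lam1 * sum_t ||b_t||_1
#                             + lam2 * sum_t ||b_{t+1} - b_t||_1
# by plain subgradient descent on simulated data.

rng = np.random.default_rng(0)
T, n, p = 5, 60, 8                 # periods, samples per period, features
LAM1, LAM2 = 0.1, 0.1              # hypothetical penalty weights

# True coefficients: sparse, with a single sparse change at t = 2.
beta_true = np.zeros((T, p))
beta_true[:, 0] = 1.0
beta_true[2:, 1] = 0.5
X = rng.normal(size=(T, n, p))
y = np.einsum("tnp,tp->tn", X, beta_true) + 0.01 * rng.normal(size=(T, n))

def loss(B):
    fit = sum(np.sum((y[t] - X[t] @ B[t]) ** 2) for t in range(T))
    sparsity = LAM1 * np.abs(B).sum()            # within-period sparsity
    variation = LAM2 * np.abs(np.diff(B, axis=0)).sum()  # sparse changes
    return fit + sparsity + variation

B = np.zeros((T, p))
step = 1e-3
for _ in range(500):
    G = np.zeros_like(B)
    for t in range(T):
        # Subgradient of the squared loss and the l1 sparsity term.
        G[t] = -2 * X[t].T @ (y[t] - X[t] @ B[t]) + LAM1 * np.sign(B[t])
    D = np.sign(np.diff(B, axis=0))   # subgradient of the variation term
    G[:-1] -= LAM2 * D
    G[1:] += LAM2 * D
    B -= step * G

print(f"objective at zero: {loss(np.zeros((T, p))):.1f} -> fitted: {loss(B):.1f}")
```

The two l1 terms play the roles the abstract assigns to its sparsity constraints: the first keeps each period's model sparse, while the second allows only a few coefficients to change from one period to the next. The exact formulation in the paper enforces these requirements with binary variables rather than convex penalties.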

Suggested Citation

  • Dimitris Bertsimas & Vassilis Digalakis & Michael Lingzhi Li & Omar Skali Lami, 2025. "Slowly Varying Regression Under Sparsity," Operations Research, INFORMS, vol. 73(3), pages 1581-1597, May.
  • Handle: RePEc:inm:oropre:v:73:y:2025:i:3:p:1581-1597
    DOI: 10.1287/opre.2022.0330

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/opre.2022.0330
    Download Restriction: no

    File URL: https://libkey.io/10.1287/opre.2022.0330?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to a page where you can use your library subscription to access this item.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:oropre:v:73:y:2025:i:3:p:1581-1597. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    We have no bibliographic references for this item. You can help add them by using this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.