Printed from https://ideas.repec.org/a/inm/ormnsc/v20y1974i5p845-862.html

Self-Scaling Variable Metric (SSVM) Algorithms

Authors

Listed:
  • Shmuel S. Oren (Xerox Corporation, Palo Alto, California, and Stanford University)

  • David G. Luenberger (Stanford University)

Abstract

A new criterion is introduced for comparing the convergence properties of variable metric algorithms, focusing on stepwise descent properties. This criterion is a bound on the rate of decrease in the function value at each iterative step (single-step convergence rate). Using this criterion as a basis for algorithm development leads to the introduction of variable coefficients to rescale the objective function at each iteration, and, correspondingly, to a new class of variable metric algorithms. Effective scaling can be implemented by restricting the parameters in a two-parameter family of variable metric algorithms. Conditions are derived for these parameters that guarantee monotonic improvement in the single-step convergence rate. These conditions are obtained by analyzing the eigenvalue structure of the associated inverse Hessian approximations.
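The self-scaling idea can be sketched in code. The sketch below is not the paper's two-parameter family but a common one-parameter member attributed to this line of work: before each BFGS update of the inverse Hessian approximation H, rescale H by the factor γ = (sᵀy)/(yᵀHy). The function names `ssvm_step` and `minimize` and the simple Armijo line search are illustrative choices, not taken from the paper.

```python
import numpy as np

def ssvm_step(H, s, y):
    """One self-scaling update of the inverse Hessian approximation H.

    s = x_new - x_old, y = grad_new - grad_old. H is first rescaled by
    gamma = (s.y) / (y.H y) -- one common member of the self-scaling
    family -- and then updated by the standard BFGS inverse formula.
    """
    sy = s @ y
    gamma = sy / (y @ (H @ y))              # self-scaling factor
    H = gamma * H                           # rescale before updating
    rho = 1.0 / sy
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse update

def minimize(f, grad, x0, iters=100, tol=1e-10):
    """Quasi-Newton loop with a backtracking (Armijo) line search."""
    x, H = np.asarray(x0, dtype=float), np.eye(len(x0))
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                          # search direction
        t, fx, gd = 1.0, f(x), g @ d
        for _ in range(60):                 # Armijo backtracking
            if f(x + t * d) <= fx + 1e-4 * t * gd:
                break
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:                   # curvature condition keeps H positive definite
            H = ssvm_step(H, s, y)
        x, g = x_new, g_new
    return x
```

On a convex quadratic f(x) = ½xᵀAx − bᵀx (gradient Ax − b), the loop recovers the minimizer A⁻¹b; the scaling step adjusts the eigenvalue spread of H against the Hessian, which is the behavior the abstract's single-step convergence criterion is designed to control.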

Suggested Citation

  • Shmuel S. Oren & David G. Luenberger, 1974. "Self-Scaling Variable Metric (SSVM) Algorithms," Management Science, INFORMS, vol. 20(5), pages 845-862, January.
  • Handle: RePEc:inm:ormnsc:v:20:y:1974:i:5:p:845-862
    DOI: 10.1287/mnsc.20.5.845

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/mnsc.20.5.845
    Download Restriction: no

    File URL: https://libkey.io/10.1287/mnsc.20.5.845?utm_source=ideas
    LibKey link: if access is restricted and your library subscribes to this service, LibKey redirects to subscription access for this item

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Nataj, Sarah & Lui, S.H., 2020. "Superlinear convergence of nonlinear conjugate gradient method and scaled memoryless BFGS method based on assumptions about the initial point," Applied Mathematics and Computation, Elsevier, vol. 369(C).
    2. S. Cipolla & C. Di Fiore & P. Zellini, 2020. "A variation of Broyden class methods using Householder adaptive transforms," Computational Optimization and Applications, Springer, vol. 77(2), pages 433-463, November.
    3. C. X. Kou & Y. H. Dai, 2015. "A Modified Self-Scaling Memoryless Broyden–Fletcher–Goldfarb–Shanno Method for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 165(1), pages 209-224, April.
    4. Saman Babaie-Kafaki & Reza Ghanbari, 2017. "A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update," 4OR, Springer, vol. 15(1), pages 85-92, March.
    5. Fahimeh Biglari & Farideh Mahmoodpur, 2016. "Scaling Damped Limited-Memory Updates for Unconstrained Optimization," Journal of Optimization Theory and Applications, Springer, vol. 170(1), pages 177-188, July.
    6. M. Al-Baali, 1998. "Numerical Experience with a Class of Self-Scaling Quasi-Newton Algorithms," Journal of Optimization Theory and Applications, Springer, vol. 96(3), pages 533-553, March.
    7. Martin Buhmann & Dirk Siegel, 2021. "Implementing and modifying Broyden class updates for large scale optimization," Computational Optimization and Applications, Springer, vol. 78(1), pages 181-203, January.
    8. Saman Babaie-Kafaki, 2015. "On Optimality of the Parameters of Self-Scaling Memoryless Quasi-Newton Updating Formulae," Journal of Optimization Theory and Applications, Springer, vol. 167(1), pages 91-101, October.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.