Printed from https://ideas.repec.org/h/spr/spochp/978-3-030-28565-4_14.html

How Effectively Train Large-Scale Machine Learning Models?

In: Optimization in Large Scale Problems

Authors

  • Aven Samareh (University of Washington)
  • Mahshid Salemi Parizi (University of Washington)

Abstract

The stochastic gradient method (SGM) is widely used as an optimization tool in many machine learning applications, including support vector machines (SVMs), logistic regression, graphical models, and deep learning. SGM computes an estimate of the gradient from a single randomly chosen sample in each iteration; applying it to large-scale machine learning problems can therefore be computationally efficient. In this work, we focus on deriving generalization bounds for a randomized algorithm, such as Random Fourier features, learned with the stochastic gradient descent algorithm. Our findings are based on a mutual relationship between the generalization error of an algorithm and its stability properties. The stability of an algorithm is measured through the generalization error, that is, the absolute difference between the testing and the training error. Overall, an algorithm is called stable if changing any single training data point varies the training error only slightly. In this work, we measured the stability of the stochastic gradient method (SGM) for learning an approximated Fourier primal support vector machine. In particular, under certain regularity assumptions, we showed that a randomized algorithm such as Random Fourier features that is trained by a stochastic gradient method (SGM) with few iterations has vanishing generalization error. Therefore, the iterative optimization algorithm can be stopped long before its convergence to reduce computational cost. We empirically verified the theoretical findings for different parameters using several data sets.
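The setup described above can be sketched in a few lines: approximate an RBF kernel with Random Fourier features, then run SGM on the regularized hinge loss (primal SVM), drawing one random sample per iteration and stopping after a small number of updates. This is a minimal illustration, not the chapter's code; the toy Gaussian data, the bandwidth gamma = 1, and the Pegasos-style decaying step size are all assumptions made here for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_map(X, W, b):
    """Random Fourier feature map approximating an RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2)."""
    D = W.shape[1]
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Toy two-class data (assumption: well-separated 2-D Gaussian blobs).
n, d, D = 200, 2, 100
X = np.vstack([rng.normal(-1.0, 0.5, (n // 2, d)),
               rng.normal(+1.0, 0.5, (n // 2, d))])
y = np.hstack([-np.ones(n // 2), np.ones(n // 2)])

# Random frequencies and phases for the feature map: W ~ N(0, 2*gamma).
gamma = 1.0
W = rng.normal(0.0, np.sqrt(2.0 * gamma), (d, D))
b = rng.uniform(0.0, 2.0 * np.pi, D)
Z = rff_map(X, W, b)

# SGM on the regularized hinge loss: a single randomly chosen sample per
# iteration, stopped well before convergence (the early stopping that the
# chapter's stability argument justifies).
lam, T = 1e-3, 2000            # regularization and iteration budget (assumptions)
w = np.zeros(D)
for t in range(1, T + 1):
    i = rng.integers(n)                  # one random training point
    eta = 1.0 / (lam * t)                # decaying step size (Pegasos-style)
    margin = y[i] * (Z[i] @ w)
    grad = lam * w - (y[i] * Z[i] if margin < 1 else 0.0)
    w -= eta * grad

train_err = np.mean(np.sign(Z @ w) != y)
```

Because each update touches only one sample, the per-iteration cost is independent of n, which is what makes SGM attractive at scale; the stability result is what licenses stopping at a small T instead of iterating to convergence.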

Suggested Citation

  • Aven Samareh & Mahshid Salemi Parizi, 2019. "How Effectively Train Large-Scale Machine Learning Models?," Springer Optimization and Its Applications, in: Mahdi Fathi & Marzieh Khakifirooz & Panos M. Pardalos (ed.), Optimization in Large Scale Problems, pages 97-110, Springer.
  • Handle: RePEc:spr:spochp:978-3-030-28565-4_14
    DOI: 10.1007/978-3-030-28565-4_14


