
Communication-Efficient Distributed Learning for High-Dimensional Support Vector Machines

Authors

Listed:
  • Xingcai Zhou

    (School of Statistics and Data Science, Nanjing Audit University, Nanjing 211085, China)

  • Hao Shen

    (School of Statistics and Data Science, Nanjing Audit University, Nanjing 211085, China)

Abstract

Distributed learning has received increasing attention in recent years and meets a pressing need of the big-data era. For the support vector machine (SVM), a powerful binary classification tool, we propose a novel, efficient distributed sparse learning algorithm in high dimensions with convex or nonconvex penalties: the communication-efficient surrogate likelihood support vector machine (CSLSVM), built on the communication-efficient surrogate likelihood (CSL) framework. We extend CSL to distributed SVMs without the need to smooth the hinge loss or the gradient of the loss. For the CSLSVM with the lasso penalty, we prove that its estimator achieves the near-oracle property of ℓ1-penalized SVM estimators computed on the whole dataset. For the CSLSVM with the smoothly clipped absolute deviation (SCAD) penalty, we show that its estimator enjoys the oracle property and that the resulting optimization problem can be solved by local linear approximation (LLA). Furthermore, we show that the LLA is guaranteed to converge to the oracle estimator, even in our distributed framework and the ultrahigh-dimensional setting, provided an appropriate initial estimator is available. The proposed approach is highly competitive with the centralized method within a few rounds of communication, and numerical experiments provide supporting evidence.
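
To make the abstract's CSL-based update concrete, the following is a minimal illustrative sketch, in Python, of one communication round for an ℓ1-penalized linear SVM in the spirit of the CSL framework of Jordan, Lee and Yang (2019): each machine reports a subgradient of the hinge loss at the current estimate, and the master minimizes the shifted (surrogate) hinge loss plus an ℓ1 penalty on its own shard. This is not the authors' CSLSVM implementation; the function names, the proximal subgradient solver, and the step-size schedule are assumptions chosen for illustration.

import numpy as np

def hinge_subgrad(w, X, y):
    """Average subgradient of the hinge loss (1/n) * sum_i max(0, 1 - y_i * x_i^T w)."""
    margins = y * (X @ w)
    active = margins < 1.0                                  # margin-violating points
    if not np.any(active):
        return np.zeros_like(w)
    return -(X[active] * y[active, None]).sum(axis=0) / len(y)

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (coordinatewise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def csl_svm_round(w0, master_data, shards, lam, n_iter=500, step=0.1):
    """One communication round: pool local subgradients at w0, then minimize the
    shifted surrogate hinge loss + lam * ||w||_1 on the master's data by
    proximal subgradient descent. `shards` lists every machine's (X, y) pair,
    including the master's own shard."""
    X1, y1 = master_data
    # One round of communication: every machine reports its subgradient at w0.
    global_grad = np.mean([hinge_subgrad(w0, X, y) for X, y in shards], axis=0)
    shift = hinge_subgrad(w0, X1, y1) - global_grad         # gradient-correction term
    w = w0.copy()
    for t in range(1, n_iter + 1):
        g = hinge_subgrad(w, X1, y1) - shift                # subgradient of the surrogate loss
        eta = step / np.sqrt(t)                             # diminishing step size
        w = soft_threshold(w - eta * g, eta * lam)          # lasso proximal step
    return w

# Example usage on toy data (hypothetical sizes):
# rng = np.random.default_rng(0)
# shards = [(rng.standard_normal((200, 50)), rng.choice([-1.0, 1.0], 200)) for _ in range(4)]
# w = csl_svm_round(np.zeros(50), shards[0], shards, lam=0.05)

Repeating csl_svm_round for a few iterations mirrors the few rounds of communication discussed in the abstract; a SCAD variant along the lines of the paper would replace the plain soft-thresholding with an LLA-reweighted ℓ1 step.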

Suggested Citation

  • Xingcai Zhou & Hao Shen, 2022. "Communication-Efficient Distributed Learning for High-Dimensional Support Vector Machines," Mathematics, MDPI, vol. 10(7), pages 1-21, March.
  • Handle: RePEc:gam:jmathe:v:10:y:2022:i:7:p:1029-:d:777705

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/10/7/1029/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/10/7/1029/
    Download Restriction: no

    References listed on IDEAS

    1. Xiang Zhang & Yichao Wu & Lan Wang & Runze Li, 2016. "Variable selection for support vector machines in moderately high dimensions," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 78(1), pages 53-76, January.
2. Fan, Jianqing & Li, Runze, 2001. "Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 96, pages 1348-1360, December.
    3. Zou, Hui, 2006. "The Adaptive Lasso and Its Oracle Properties," Journal of the American Statistical Association, American Statistical Association, vol. 101, pages 1418-1429, December.
    4. Zhao, Meng & Kulasekera, K.B., 2006. "Consistent linear model selection," Statistics & Probability Letters, Elsevier, vol. 76(5), pages 520-530, March.
    5. Michael I. Jordan & Jason D. Lee & Yun Yang, 2019. "Communication-Efficient Distributed Statistical Inference," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 114(526), pages 668-681, April.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Wang, Kangning & Li, Shaomin, 2021. "Robust distributed modal regression for massive data," Computational Statistics & Data Analysis, Elsevier, vol. 160(C).
    2. Hansheng Wang & Bo Li & Chenlei Leng, 2009. "Shrinkage tuning parameter selection with a diverging number of parameters," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(3), pages 671-683, June.
    3. Wei Wang & Shou‐En Lu & Jerry Q. Cheng & Minge Xie & John B. Kostis, 2022. "Multivariate survival analysis in big data: A divide‐and‐combine approach," Biometrics, The International Biometric Society, vol. 78(3), pages 852-866, September.
    4. Julien Chevallier & Dominique Guégan & Stéphane Goutte, 2021. "Is It Possible to Forecast the Price of Bitcoin?," Forecasting, MDPI, vol. 3(2), pages 1-44, May.
    5. Wang, Kangning & Li, Shaomin & Zhang, Benle, 2021. "Robust communication-efficient distributed composite quantile regression and variable selection for massive data," Computational Statistics & Data Analysis, Elsevier, vol. 161(C).
    6. Zhan Liu & Xiaoluo Zhao & Yingli Pan, 2023. "Communication-efficient distributed estimation for high-dimensional large-scale linear regression," Metrika: International Journal for Theoretical and Applied Statistics, Springer, vol. 86(4), pages 455-485, May.
    7. Tutz, Gerhard & Pößnecker, Wolfgang & Uhlmann, Lorenz, 2015. "Variable selection in general multinomial logit models," Computational Statistics & Data Analysis, Elsevier, vol. 82(C), pages 207-222.
    8. Margherita Giuzio, 2017. "Genetic algorithm versus classical methods in sparse index tracking," Decisions in Economics and Finance, Springer;Associazione per la Matematica, vol. 40(1), pages 243-256, November.
    9. Xu, Yang & Zhao, Shishun & Hu, Tao & Sun, Jianguo, 2021. "Variable selection for generalized odds rate mixture cure models with interval-censored failure time data," Computational Statistics & Data Analysis, Elsevier, vol. 156(C).
    10. Emmanouil Androulakis & Christos Koukouvinos & Kalliopi Mylona & Filia Vonta, 2010. "A real survival analysis application via variable selection methods for Cox's proportional hazards model," Journal of Applied Statistics, Taylor & Francis Journals, vol. 37(8), pages 1399-1406.
    11. Ni, Xiao & Zhang, Hao Helen & Zhang, Daowen, 2009. "Automatic model selection for partially linear models," Journal of Multivariate Analysis, Elsevier, vol. 100(9), pages 2100-2111, October.
    12. Peng, Heng & Lu, Ying, 2012. "Model selection in linear mixed effect models," Journal of Multivariate Analysis, Elsevier, vol. 109(C), pages 109-129.
    13. Yize Zhao & Matthias Chung & Brent A. Johnson & Carlos S. Moreno & Qi Long, 2016. "Hierarchical Feature Selection Incorporating Known and Novel Biological Information: Identifying Genomic Features Related to Prostate Cancer Recurrence," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 111(516), pages 1427-1439, October.
    14. G. Aneiros & P. Vieu, 2016. "Sparse nonparametric model for regression with functional covariate," Journal of Nonparametric Statistics, Taylor & Francis Journals, vol. 28(4), pages 839-859, October.
    15. Stephan Brunow & Stefanie Lösch & Ostap Okhrin, 2022. "Labor market tightness and individual wage growth: evidence from Germany," Journal for Labour Market Research, Springer;Institute for Employment Research/ Institut für Arbeitsmarkt- und Berufsforschung (IAB), vol. 56(1), pages 1-21, December.
    16. Hui Xiao & Yiguo Sun, 2020. "Forecasting the Returns of Cryptocurrency: A Model Averaging Approach," JRFM, MDPI, vol. 13(11), pages 1-15, November.
    17. Jun Zhu & Hsin‐Cheng Huang & Perla E. Reyes, 2010. "On selection of spatial linear models for lattice data," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 72(3), pages 389-402, June.
    18. Gareth M. James & Peter Radchenko & Jinchi Lv, 2009. "DASSO: connections between the Dantzig selector and lasso," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 71(1), pages 127-142, January.
    19. Lam, Clifford, 2008. "Estimation of large precision matrices through block penalization," LSE Research Online Documents on Economics 31543, London School of Economics and Political Science, LSE Library.
    20. Ping Wu & Xinchao Luo & Peirong Xu & Lixing Zhu, 2017. "New variable selection for linear mixed-effects models," Annals of the Institute of Statistical Mathematics, Springer;The Institute of Statistical Mathematics, vol. 69(3), pages 627-646, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:10:y:2022:i:7:p:1029-:d:777705. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.