Asymptotic normality of support vector machine variants and other regularized kernel methods
In nonparametric classification and regression problems, regularized kernel methods, in particular support vector machines, attract much attention in theoretical and in applied statistics. In an abstract sense, regularized kernel methods (simply called SVMs here) can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. For smooth loss functions L, it is shown that the difference between the estimator, i.e. the empirical SVM f_{L,D_n,λ_{D_n}}, and the theoretical SVM f_{L,P,λ_0} is asymptotically normal with rate √n. That is, √n(f_{L,D_n,λ_{D_n}} − f_{L,P,λ_0}) converges weakly to a Gaussian process in the reproducing kernel Hilbert space. As is common in real applications, the choice of the regularization parameter λ_{D_n} in f_{L,D_n,λ_{D_n}} may depend on the data D_n. The proof is done by an application of the functional delta-method and by showing that the SVM functional P ↦ f_{L,P,λ} is suitably Hadamard-differentiable.
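The abstract's setting can be illustrated numerically. The sketch below is only an informal finite-sample illustration, not the paper's construction: it uses the smooth least-squares loss (so the empirical SVM is kernel ridge regression, computable in closed form via the representer theorem), a Gaussian kernel, a data-size-dependent regularization parameter λ_{D_n} → λ_0, and a large-sample fit as a stand-in for the theoretical SVM f_{L,P,λ_0}; all of these concrete choices are assumptions made for the example.

```python
# Illustrative sketch (assumptions: least-squares loss, Gaussian kernel,
# lambda_{D_n} = lambda_0 + 1/n, large-sample fit as proxy for f_{L,P,lambda_0}).
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    # K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_svm(X, y, lam, gamma=1.0):
    # Empirical SVM for the smooth least-squares loss: by the representer
    # theorem the RKHS minimizer solves (K + n*lam*I) alpha = y.
    n = len(X)
    K = gaussian_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda Z: gaussian_kernel(Z, X, gamma) @ alpha

rng = np.random.default_rng(0)
def sample(n):
    X = rng.uniform(-1.0, 1.0, size=(n, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.normal(size=n)
    return X, y

lam0 = 0.01                              # the fixed limit lambda_0
grid = np.linspace(-1.0, 1.0, 50)[:, None]
f_ref = fit_svm(*sample(3000), lam=lam0)  # proxy for the theoretical SVM

scaled = []
for n in (100, 400, 1600):
    f_n = fit_svm(*sample(n), lam=lam0 + 1.0 / n)  # data-dependent lambda_{D_n}
    dev = np.max(np.abs(f_n(grid) - f_ref(grid)))
    scaled.append(np.sqrt(n) * dev)
    print(n, scaled[-1])  # sqrt(n)-scaled sup-deviation on the grid
```

Under the paper's asymptotic-normality result one would expect the √n-scaled deviation to remain stochastically bounded as n grows; this toy run only gauges that behavior on a grid, and the reference fit itself carries estimation error.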
Volume (Year): 106 (2012)
Issue (Month): C
Handle: RePEc:eee:jmvana:v:106:y:2012:i:c:p:92-117