Asymptotic normality of support vector machine variants and other regularized kernel methods
In nonparametric classification and regression problems, regularized kernel methods, in particular support vector machines, attract much attention in theoretical and in applied statistics. In an abstract sense, regularized kernel methods (simply called SVMs here) can be seen as regularized M-estimators for a parameter in a (typically infinite-dimensional) reproducing kernel Hilbert space. For smooth loss functions $L$, it is shown that the difference between the estimator, i.e. the empirical SVM $f_{L,D_n,\lambda_{D_n}}$, and the theoretical SVM $f_{L,P,\lambda_0}$ is asymptotically normal with rate $\sqrt{n}$. That is, $\sqrt{n}\,(f_{L,D_n,\lambda_{D_n}} - f_{L,P,\lambda_0})$ converges weakly to a Gaussian process in the reproducing kernel Hilbert space. As is common in real applications, the choice of the regularization parameter $\lambda_{D_n}$ in $f_{L,D_n,\lambda_{D_n}}$ may depend on the data. The proof proceeds by an application of the functional delta-method and by showing that the SVM functional $P \mapsto f_{L,P,\lambda}$ is suitably Hadamard differentiable.
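For concreteness, here is a minimal sketch of the objects named in the abstract. The risk-minimization definitions below follow the standard convention for regularized kernel methods and are filled in as an assumption, since the abstract itself only introduces the notation; $H$ denotes the reproducing kernel Hilbert space and $D_n = ((x_1, y_1), \dots, (x_n, y_n))$ the sample.

% Theoretical and empirical SVM: risk plus squared-norm penalty over the RKHS H.
\[
  f_{L,P,\lambda} = \operatorname*{arg\,min}_{f \in H}\;
    \mathbb{E}_{P}\bigl[L(X, Y, f(X))\bigr] + \lambda \lVert f \rVert_H^2,
  \qquad
  f_{L,D_n,\lambda} = \operatorname*{arg\,min}_{f \in H}\;
    \frac{1}{n}\sum_{i=1}^{n} L\bigl(x_i, y_i, f(x_i)\bigr) + \lambda \lVert f \rVert_H^2.
\]
% Asymptotic normality: weak convergence to a zero-mean Gaussian process G in H,
% under a suitable condition on the data-dependent regularization parameter
% (an assumption in this sketch, e.g. sqrt(n)*(lambda_{D_n} - lambda_0) -> 0).
\[
  \sqrt{n}\,\bigl(f_{L,D_n,\lambda_{D_n}} - f_{L,P,\lambda_0}\bigr)
  \rightsquigarrow \mathbb{G} \quad \text{in } H .
\]

This also makes the proof strategy transparent: once the map $P \mapsto f_{L,P,\lambda}$ is Hadamard differentiable, the functional delta-method transfers the weak convergence of the empirical process $\sqrt{n}\,(\mathbb{P}_n - P)$ to the plugged-in estimator, yielding the Gaussian limit above.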
Volume (Year): 106 (2012)
Issue (Month): C ()
Pages: 92-117