Consistency and robustness of kernel based regression
We investigate properties of kernel-based regression (KBR) methods inspired by the convex risk minimization method of support vector machines. We first describe the relation between the loss function used by the KBR method and the tail of the response variable Y. We then establish a consistency result for KBR and give assumptions for the existence of the influence function. In particular, our results allow one to choose the loss function and the kernel so as to obtain computationally tractable and consistent KBR methods with bounded influence functions. Furthermore, bounds for the sensitivity curve, a finite-sample version of the influence function, are developed, and some numerical experiments are discussed.
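As a minimal illustration of the setting (not taken from the paper), the sketch below fits a KBR estimator for the special case of the squared loss, i.e. kernel ridge regression with a Gaussian kernel, and then computes an empirical sensitivity curve by adding a single outlying observation and measuring the rescaled change in the fit. All function names (`rbf_kernel`, `fit_krr`, `sensitivity`) and parameter values are our own choices for the sketch.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=0.5):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_krr(X, y, lam=0.01, gamma=0.5):
    """Regularized empirical risk minimization with the squared loss:
    solve (K + n*lam*I) alpha = y for the representer coefficients."""
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def predict(X_train, alpha, X_new, gamma=0.5):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

def sensitivity(X, y, z_x, z_y, X_eval, lam=0.01, gamma=0.5):
    """Empirical sensitivity curve: n+1 times the change in the fitted
    function on X_eval when one extra point (z_x, z_y) is added."""
    n = len(X)
    a0 = fit_krr(X, y, lam, gamma)
    Xz = np.vstack([X, z_x])
    yz = np.append(y, z_y)
    a1 = fit_krr(Xz, yz, lam, gamma)
    return (n + 1) * (predict(Xz, a1, X_eval, gamma)
                      - predict(X, a0, X_eval, gamma))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
alpha = fit_krr(X, y)
y_hat = predict(X, alpha, X)

# Perturb with a gross outlier at x = 0, y = 10 and inspect the effect.
sc = sensitivity(X, y, np.array([[0.0]]), 10.0, X)
```

With the (non-robust) squared loss, the sensitivity curve grows linearly in the outlier value `z_y`; the paper's point is that suitable choices of loss function and kernel keep the influence function, and hence such perturbation effects, bounded.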
Date of creation: 2005
Date of revision:
Contact details of provider:
  Postal: Vogelpothsweg 78, D-44221 Dortmund
  Phone: (0231) 755-3125
  Fax: (0231) 755-5284
  Web page: http://www.statistik.tu-dortmund.de/sfb475.html