Robustness properties of support vector machines and related methods
Published on Feb 25, 2007 · 4882 Views
The talk brings together methods from two disciplines: machine learning theory and robust statistics. We argue that robustness is an important aspect, and we show that many existing machine learning methods …
Chapter list
Robustness properties of support vector machines (00:29)
Convex Risk Minimization (01:30)
Kernels (03:55)
Convex Risk Minimization (Vapnik ’98) (06:04)
Loss functions: classification (10:47)
Loss functions: regression (11:41)
Robustness of CRM (13:48)
Main question (16:03)
Robustness concepts (17:37)
Robustness concepts (18:40)
Comparison (20:18)
Robustness: classification (Chr & Steinwart, ’04) (23:14)
L′(z_y, f_{P,λ}(z_x)) Φ(z_x) for KLR with RBF-kernel (24:32)
Further results (Chr & Steinwart, ’04, ’05) (26:31)
Regression examples: n=200 data points, skewness (27:05)
Problem: SVM / CRM are ‘non-robust posed problems’ (31:37)
3. Robust Learning from Bites (Chr ’05) (35:21)
RLB: Robust Learning from Bites (36:57)
RLB: computational aspects (38:15)
RLB for kernel methods (39:15)
RLB for kernel methods: number of support vectors (39:43)
RLB for kernel methods: L-risk consistency (44:31)
Proof of L-risk consistency (44:52)
Consistency of RLB (45:20)
Finite-sample breakdown point of RLB (46:02)
4. Application: Motor Vehicle Insurance (47:42)
Statistical objectives (48:30)
Complex dependencies (48:41)
Statistical model (49:09)
Estimation of pure premium E(Y|X=x) (50:59)
Estimation of pure premium E(Y|X=x) (51:45)
RLB based on median: predictions ŷ_i for 100 customers (54:22)
References (01:03:40)
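The chapters above only name the "Robust Learning from Bites" (RLB) procedure without details. As an illustration of the basic idea they suggest (fit a kernel method on each of several random subsets, or "bites", of a large data set and aggregate the per-bite predictions robustly, e.g. by the median), here is a minimal Python sketch. The helper names fit_rlb and predict_rlb, the use of scikit-learn's SVR, and all parameter values are assumptions for illustration only, not the exact estimator analysed in Christmann ’05.

# Minimal, illustrative sketch of the RLB idea: partition the data into
# random "bites", fit one kernel regressor per bite, aggregate by the median.
# This is a simplified reading of the procedure, not the original estimator.
import numpy as np
from sklearn.svm import SVR

def fit_rlb(X, y, n_bites=5, rng=None, **svr_params):
    """Fit one SVR per bite; return the list of fitted models."""
    rng = np.random.default_rng(rng)
    idx = rng.permutation(len(X))
    models = []
    for bite in np.array_split(idx, n_bites):
        m = SVR(**svr_params)          # e.g. kernel='rbf', C=..., epsilon=... (placeholders)
        m.fit(X[bite], y[bite])
        models.append(m)
    return models

def predict_rlb(models, X_new):
    """Aggregate per-bite predictions with the (robust) coordinate-wise median."""
    preds = np.vstack([m.predict(X_new) for m in models])
    return np.median(preds, axis=0)

if __name__ == "__main__":
    # Synthetic regression data with a few gross outliers in the response.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(2000, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(2000)
    y[:20] += 50.0                     # contaminate a small fraction of the sample
    models = fit_rlb(X, y, n_bites=10, rng=0, kernel="rbf", C=1.0, epsilon=0.1)
    print(predict_rlb(models, np.array([[0.0], [1.5]])))

Because each bite is small, the per-bite fits are cheap, and the median aggregation limits the influence of a few contaminated bites on the final prediction; this is the computational and robustness motivation the chapter titles point to.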