The Limit of One-Class SVM
published: Feb. 25, 2007, recorded: October 2005, views: 9837
In this talk, I will present an analysis of the asymptotic behaviour of the One-Class support vector machine (SVM), a popular algorithm for outlier detection. I will show that the One-Class SVM asymptotically estimates a truncated version of the density of the distribution generating the data, in the case where a Gaussian kernel is used with a well-calibrated decreasing bandwidth parameter and the regularization parameter of the algorithm is held fixed as the training sample size goes to infinity.

A long version of this work, which also covers extensions to the two-class case and to more general convex loss functions, is available at www.lri.fr/vert/Publi/regularizeGaussianKernel.ps.