Infinite Kernel Learning
published: Dec. 20, 2008, recorded: December 2008, views: 5010
In this paper we build upon the Multiple Kernel Learning (MKL) framework. We rewrite the standard MKL formulation, which leads to a Semi-Infinite Program, and devise a new algorithm to solve it (Infinite Kernel Learning, IKL). The IKL algorithm is applicable to both the finite and the infinite case, and we find it to be faster and more stable than SimpleMKL. Furthermore, we present the first large-scale comparison of SVMs to MKL on a variety of benchmark datasets, also including IKL in the comparison. The results show two things: a) for many datasets there is no benefit in using MKL/IKL instead of the SVM classifier, so the flexibility of using more than one kernel seems to be of no use; b) on some datasets IKL yields massive increases in accuracy over SVM/MKL due to the possibility of using a largely increased kernel set. For those cases, parameter selection through cross-validation or MKL is not applicable.
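As a rough illustration of the MKL idea the abstract builds on, the sketch below forms a convex combination of base kernels, K = Σ_m β_m K_m with β_m ≥ 0 and Σ_m β_m = 1, and trains an SVM on the combined Gram matrix. The toy data, the RBF bandwidth choices, and the fixed uniform weights are all assumptions for illustration; MKL/IKL would instead learn the weights β jointly with the classifier.

```python
# Illustrative sketch only: fixed uniform kernel weights stand in for the
# coefficients that MKL/IKL would optimize; data and bandwidths are toy choices.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)  # toy XOR-style labels

# A small family of base kernels (RBF kernels with different bandwidths).
gammas = [0.1, 1.0, 10.0]
kernels = [rbf_kernel(X, X, gamma=g) for g in gammas]

# Convex combination: beta_m >= 0, sum(beta) = 1 (uniform here, not learned).
beta = np.ones(len(kernels)) / len(kernels)
K = sum(b * Km for b, Km in zip(beta, kernels))

# Train an SVM on the precomputed combined kernel.
clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(K, y))
```

Under the infinite-kernel view, the finite family of bandwidths above is replaced by a continuously parameterized kernel set, which is what turns the problem into a semi-infinite program.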