Various Formulations for Learning the Kernel and Structured Sparsity
published: Jan. 12, 2011, recorded: December 2010, views: 4168
I will review an approach to learning the kernel that consists of minimizing a convex objective function over a prescribed set of kernel matrices. I will establish some important properties of this problem and present a reformulation of it from a feature-space perspective. A well-studied example covered by this setting is multiple kernel learning, in which the set of kernels is the convex hull of a finite set of basic kernels. I will discuss extensions of this setting to more complex kernel families, which involve additional constraints and a continuous parametrization. Some of these examples are motivated by multi-task learning and structured sparsity, which I will describe in some detail during the talk.
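The multiple kernel learning setting mentioned above can be illustrated with a small sketch: learn convex combination weights over a finite set of base kernel matrices by projected gradient descent on the kernel ridge regression objective. This is a minimal illustration under assumed choices (squared loss, a fixed step size, and a standard Euclidean projection onto the probability simplex), not the specific formulation from the talk; the function names `learn_kernel_weights` and `project_simplex` are illustrative.

```python
import numpy as np


def project_simplex(v):
    """Euclidean projection of v onto the probability simplex {t : t >= 0, sum t = 1}."""
    u = np.sort(v)[::-1]                 # sort entries in decreasing order
    css = np.cumsum(u)
    # Largest index rho with u[rho] + (1 - css[rho]) / (rho + 1) > 0
    rho = np.nonzero(u + (1.0 - css) / (np.arange(len(v)) + 1.0) > 0)[0][-1]
    tau = (1.0 - css[rho]) / (rho + 1.0)
    return np.maximum(v + tau, 0.0)


def learn_kernel_weights(kernels, y, lam=0.1, lr=0.1, iters=100):
    """Learn convex weights theta over base kernel matrices (a hypothetical sketch).

    Minimizes J(theta) = y^T (K(theta) + lam*I)^{-1} y with K(theta) = sum_m theta_m K_m,
    the optimal-value objective of kernel ridge regression, over the simplex.
    """
    M, n = len(kernels), len(y)
    theta = np.full(M, 1.0 / M)          # start from the uniform combination
    for _ in range(iters):
        K = sum(t * Km for t, Km in zip(theta, kernels))
        alpha = np.linalg.solve(K + lam * np.eye(n), y)
        # dJ/dtheta_m = -alpha^T K_m alpha  (alpha = (K + lam*I)^{-1} y)
        grad = np.array([-(alpha @ Km @ alpha) for Km in kernels])
        theta = project_simplex(theta - lr * grad)
    return theta


# Toy demonstration: the target is exactly linear in x, so the linear base
# kernel should receive most of the weight over a near-identity RBF kernel.
rng = np.random.default_rng(0)
x = rng.standard_normal(40)
y = x.copy()
K_lin = np.outer(x, x)                                   # linear kernel, matches the target
K_rbf = np.exp(-50.0 * (x[:, None] - x[None, :]) ** 2)   # very narrow RBF, close to identity
theta = learn_kernel_weights([K_lin, K_rbf], y)
```

The convex-hull constraint is exactly what the simplex projection enforces: each iterate remains a convex combination of the basic kernels, so the learned kernel stays inside the prescribed set.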