published: Feb. 25, 2007, recorded: February 2006, views: 20925
In this introductory course we will discuss how log-linear models can be extended to feature space. These log-linear models have been studied by statisticians for a long time under the name of the exponential family of probability distributions. We provide a unified framework in which many existing kernel algorithms can be viewed as special cases. Our framework also allows us to derive many natural generalizations of existing algorithms. In particular, we show how we can recover Gaussian Processes, Support Vector Machines, multi-class discrimination, and sequence annotation (via Conditional Random Fields). We also show how to deal with missing data and how to perform MAP estimation on Conditional Random Fields in feature space. The requisite background for the course will be covered briskly in the first two lectures. Knowledge of linear algebra and familiarity with functional analysis will be helpful.
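As a rough illustration of the exponential-family form the abstract refers to, the sketch below writes a density as p(x | theta) = exp(<phi(x), theta> - g(theta)), where phi is the sufficient statistic (the feature map) and g is the log-partition function, and checks that the Bernoulli distribution fits this template. The function names (phi, log_partition, prob) are illustrative and not taken from the lecture itself.

```python
import math

def phi(x):
    """Sufficient statistic (feature map) for a Bernoulli variable x in {0, 1}."""
    return float(x)

def log_partition(theta):
    """g(theta) = log sum over x in {0, 1} of exp(theta * phi(x)),
    which normalizes the density."""
    return math.log(1.0 + math.exp(theta))

def prob(x, theta):
    """Exponential-family density: exp(theta * phi(x) - g(theta))."""
    return math.exp(theta * phi(x) - log_partition(theta))

# With the natural parameter theta = log(p / (1 - p)) this recovers
# the usual Bernoulli(p): here p = 0.3.
theta = math.log(0.3 / 0.7)
print(prob(1, theta))  # probability of x = 1, i.e. 0.3
print(prob(0, theta))  # probability of x = 0, i.e. 0.7
```

Replacing phi with a richer (possibly kernel-induced) feature map is the step the course takes to move these models into feature space.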
Download slides: mlss06au_smola_ef.pdf (1.2 MB)
Reviews and comments:
Really nice talk! Thanks for uploading.
And great cameraman!
This lecture is very helpful for me. Thanks!