Statistical Learning Theory

author: Olivier Bousquet, Google, Inc.
published: Feb. 25, 2007, recorded: August 2003, views: 4552

Watch videos: (click on thumbnail to launch)

Part 1 (45:29)
Part 2 (45:35)
Part 3 (46:02)
Part 4 (47:25)
Part 5 (49:43)
Part 6 (44:20)
Part 7 (39:31)

Description

This course will give a detailed introduction to learning theory, with a focus on the classification problem. It will be shown how to obtain (probabilistic) bounds on the generalization error for certain types of algorithms. The main themes will be:

* probabilistic inequalities and concentration inequalities
* union bounds and chaining
* measuring the size of a function class: Vapnik-Chervonenkis dimension, shattering dimension, and Rademacher averages
* classification with real-valued functions

Some knowledge of probability theory would be helpful but is not required, since the main tools will be introduced.
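To give a flavor of the themes listed above, here is a minimal sketch (not from the lecture itself) of two of the quantities the course studies: the Hoeffding-plus-union-bound uniform deviation bound for a finite hypothesis class, and a Monte-Carlo estimate of the empirical Rademacher average of a finite class. All function names and parameter choices here are illustrative assumptions.

```python
import math
import random

def hoeffding_union_bound(n, num_hypotheses, delta):
    """Hoeffding's inequality plus a union bound over a finite class of
    size M = num_hypotheses: with probability at least 1 - delta, every
    hypothesis satisfies |test error - train error| <= sqrt(ln(2M/delta) / (2n))
    on n i.i.d. samples with 0/1 loss."""
    return math.sqrt(math.log(2 * num_hypotheses / delta) / (2 * n))

def empirical_rademacher(predictions, num_rounds=2000, seed=0):
    """Monte-Carlo estimate of the empirical Rademacher average of a finite
    class: predictions[h] is the vector of +/-1 outputs of hypothesis h on
    the fixed sample. We average, over random sign vectors sigma, the best
    correlation any hypothesis achieves with sigma."""
    rng = random.Random(seed)
    n = len(predictions[0])
    total = 0.0
    for _ in range(num_rounds):
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        total += max(sum(s * p for s, p in zip(sigma, pred)) / n
                     for pred in predictions)
    return total / num_rounds

# Example: 1000 samples, 100 hypotheses, 95% confidence.
bound = hoeffding_union_bound(n=1000, num_hypotheses=100, delta=0.05)
print(f"uniform deviation bound: {bound:.3f}")
```

Note that the union-bound argument only works for finite classes; the VC dimension, shattering dimension, and Rademacher averages covered in the course are precisely the tools that replace the count M when the class is infinite.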


Reviews and comments:

Comment 1, ibrahim güney, March 21, 2007 at 11:39 a.m.:

good


Comment 2, clueless, December 29, 2007 at 11:49 p.m.:

Doesn't work on Mac!


Comment 3, hardlianotion, July 26, 2008 at 6:17 p.m.:

Does work on Mac!
