Exponential Families

published: Feb. 25, 2007,   recorded: February 2006,   views: 3158

Slides

Slide titles (from 0:01, in order of first appearance):

- Exponential Families and Kernels
- Outline
- Lecture 1
- The Exponential Family
- Example: Binomial Distribution
- Example: Laplace Distribution
- Example: Normal Distribution
- Example: Multinomial Distribution
- Example: Poisson Distribution
- Example: Beta Distribution
- Example: Gamma Distribution
- Zoology of Exponential Families
- Recall Benefits: Log-partition function is nice
- Application: Laplace distribution
- Benefits: Maximum Entropy Estimate
- Using it
- Application: Discrete Events
- Tossing a dice
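As a concrete illustration of the "Log-partition function is nice" slides, here is a minimal sketch, using the standard textbook exponential-family form of the binomial distribution (this parameterization is standard, not taken verbatim from the lecture slides): the first and second derivatives of the log-partition function recover the mean and variance.

```python
import math

# Binomial(n, p) written in exponential-family form:
#   p(x) = C(n, x) * exp(theta * x - A(theta))
# with natural parameter theta = log(p / (1 - p))
# and log-partition function A(theta) = n * log(1 + exp(theta)).
n, p = 10, 0.3
theta = math.log(p / (1 - p))

def log_partition(t):
    return n * math.log(1 + math.exp(t))

# Finite differences stand in for the exact derivatives:
#   A'(theta)  = E[x]   = n * p
#   A''(theta) = Var[x] = n * p * (1 - p)
h = 1e-4
mean = (log_partition(theta + h) - log_partition(theta - h)) / (2 * h)
var = (log_partition(theta + h) - 2 * log_partition(theta)
       + log_partition(theta - h)) / h**2

print(mean)  # ≈ n * p = 3.0
print(var)   # ≈ n * p * (1 - p) = 2.1
```

The same recipe works for every distribution in the slide list (Laplace, Normal, Poisson, ...) once it is written in the canonical form; only the natural parameter and log-partition function change.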


Part 1 57:48

Part 2 16:10

Part 3 48:52

Part 4 34:20

Part 5 56:26

Part 8 38:03

Description

In this introductory course we will discuss how log linear models can be extended to feature space. These log linear models have been studied by statisticians for a long time under the name of the exponential family of probability distributions. We provide a unified framework which can be used to view many existing kernel algorithms as special cases. Our framework also allows us to derive many natural generalizations of existing algorithms. In particular, we show how we can recover Gaussian Processes, Support Vector Machines, multi-class discrimination, and sequence annotation (via Conditional Random Fields). We also show how to deal with missing data and perform MAP estimation on Conditional Random Fields in feature space. The requisite background for the course will be covered briskly in the first two lectures. Knowledge of linear algebra and familiarity with functional analysis will be helpful.
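In the usual notation (a standard formulation, not quoted from the lecture), a distribution belongs to the exponential family when it can be written as

```latex
p(x \mid \theta) = \exp\bigl(\langle \phi(x), \theta \rangle - g(\theta)\bigr)\, h(x),
\qquad
g(\theta) = \log \int \exp\bigl(\langle \phi(x), \theta \rangle\bigr)\, h(x)\, dx,
```

where $\phi(x)$ is the sufficient statistic, $\theta$ the natural parameter, and $g(\theta)$ the log-partition function. Extending the model to feature space amounts to letting $\phi$ map into a reproducing kernel Hilbert space, which is how the kernel algorithms named in the description arise as special cases.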
