Information Geometry
Published on Feb 25, 2007 · 35,522 views
This tutorial will focus on entropy, exponential families, and information projection. We'll start by seeing the sense in which entropy is the only reasonable definition of randomness. We will then use the maximum entropy principle to motivate exponential families, and conclude with information projection and its geometric interpretation.
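As a quick orientation (notation mine, not transcribed from the slides): for a distribution p = (p_1, ..., p_n), the measure of randomness the tutorial arrives at is the Shannon entropy

H(p) = -\sum_{i=1}^{n} p_i \log p_i,

which, up to a constant factor, is the only continuous, symmetric measure that is maximized by the uniform distribution and obeys a natural grouping (additivity) rule.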
Chapter list
Information geometry (00:04)
Learning a distribution (00:14)
Outline (02:29)

Part I: Entropy (02:56)
Formulating the problem (03:01)
What is randomness? (05:11)
Entropy (07:37)
Entropy is concave (10:41)
Properties of entropy (11:19)
Additivity (13:07)
Properties of entropy, cont'd (14:01)
KL divergence (16:04)
Entropy and KL divergence (18:24)
Another justification of entropy (19:53)
Asymptotic equipartition (21:27)
AEP: examples (22:55)
Proof of AEP (24:40)
Back to our main question (26:53)
Maximum entropy (27:51)
Alternative formulation (30:52)
A projection operation (31:43)
Solution by calculus (35:03)
Form of the solution (37:10)
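A sketch of where Part I lands, in my own notation rather than the lecture's: the KL divergence between distributions p and q is

D(p \| q) = \sum_x p(x) \log \frac{p(x)}{q(x)} \ge 0,

with equality iff p = q. The maximum entropy problem asks for the p maximizing H(p) subject to moment constraints \mathbb{E}_p[T(x)] = \alpha; introducing Lagrange multipliers \theta for the constraints yields a solution of exponential form,

p^*(x) \propto \exp(\theta^\top T(x)),

which is what motivates Part II.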
Part II: Exponential families (38:50)
Exponential families (39:06)
Natural parameter space (41:35)
Example: Bernoulli (43:17)
Parametrization of Bernoulli (46:09)
Example: Poisson (47:21)
Example: Gaussian (49:43)
Properties of exponential families (51:41)
Maximum likelihood estimation (54:18)
Maximum likelihood, cont'd (56:44)
Our toy problem (58:05)
The two spaces (58:53)
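For reference, the standard form covered in Part II (my notation, a summary rather than a transcript): an exponential family is

p_\theta(x) = h(x) \exp\!\big(\theta^\top T(x) - A(\theta)\big), \qquad A(\theta) = \log \int h(x)\, e^{\theta^\top T(x)} \, dx,

with natural parameter space \Theta = \{\theta : A(\theta) < \infty\}. The Bernoulli example fits this form via p^x (1-p)^{1-x} = \exp\!\big(x \log\tfrac{p}{1-p} + \log(1-p)\big), so T(x) = x and \theta = \log\tfrac{p}{1-p}. Since \nabla A(\theta) = \mathbb{E}_\theta[T(x)], maximum likelihood reduces to moment matching: choose \theta so that \mathbb{E}_\theta[T(x)] equals the empirical average \tfrac{1}{n}\sum_i T(x_i).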
Part III: Information projection (01:00:20)
Back to maximum entropy (01:00:32)
Maximum entropy example (01:02:11)
Maximum entropy: restatement (01:04:08)
Proof (01:05:11)
Geometric interpretation (01:08:36)
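A brief note on the geometric picture the final chapter refers to (standard information-geometry facts, stated in my notation rather than quoted from the lecture): the information projection of q onto a linear family \mathcal{P} = \{p : \mathbb{E}_p[T(x)] = \alpha\} is

p^* = \arg\min_{p \in \mathcal{P}} D(p \| q),

and it satisfies the Pythagorean identity

D(p \| q) = D(p \| p^*) + D(p^* \| q) \quad \text{for all } p \in \mathcal{P},

so p^* behaves like an orthogonal projection, with D playing the role of squared distance. Maximum entropy is the special case where q is the uniform distribution.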