
Lecture 2: Entropy and Data Compression (I): Introduction to Compression, Information Theory and Entropy

Published on Nov 05, 2012 · 36,504 views


Chapter list

Data Compression: How To Measure Information Content 00:00
A Noisy Channel (1) 00:22
The Binary Symmetric Channel 00:30
Shannon's Noisy Channel Coding Theorem (1) 01:08
Predict The Missing Text (1) 02:48
Predict The Missing Text (2) 04:50
A Noisy Channel (2) 05:00
A Noisy Channel (3) 05:52
A Bent Coin (1) 07:11
A Bent Coin (2) 07:18
An Ensemble (1) 09:06
An Ensemble (2) 10:19
Shannon Information Content (1) 11:06
Shannon Information Content (2) 12:30
Shannon Information Content (3) 14:23
Shannon Information Content (4) 15:32
Shannon Information Content (5) 17:18
Entropy (1) 19:09
The Weighing Problem (1) 20:42
The Weighing Problem (2) 21:12
The Weighing Problem (3) 22:50
The Weighing Problem (4) 24:19
The Weighing Problem (5) 24:58
The Weighing Problem (6) 25:12
The Weighing Problem (7) 27:08
The Weighing Problem (8) 28:37
The Weighing Problem (10) 29:53
The Weighing Problem (11) 37:03
Entropy Maximization (1) 38:35
Entropy Maximization (2) 39:25
Entropy Maximization (3) 40:16
The Bent Coin Lottery 46:46
Homework 49:36
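Several chapters above (the bent coin, Shannon information content, entropy) revolve around two standard formulas: the information content of an outcome, h(x) = log2(1/p(x)), and the entropy of an ensemble, H(X) = Σ p(x) log2(1/p(x)). As a minimal illustration of those definitions (not code from the lecture itself), here is a short Python sketch evaluating both for a bent coin with an assumed P(heads) = 0.1:

```python
import math

def info_content(p):
    """Shannon information content of an outcome with probability p, in bits: h = log2(1/p)."""
    return math.log2(1.0 / p)

def entropy(probs):
    """Entropy of an ensemble, in bits: H = sum over outcomes of p * log2(1/p).
    Terms with p = 0 contribute nothing, since p*log2(1/p) -> 0 as p -> 0."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Bent coin with P(heads) = 0.1 (illustrative value)
p_heads = 0.1
print(info_content(p_heads))          # h(heads) = log2(10), about 3.32 bits
print(entropy([p_heads, 1 - p_heads]))  # H of the coin, about 0.47 bits
```

Note that the rare outcome (heads) carries much more information than the entropy of the coin as a whole: entropy is the *average* information content, weighted by how often each outcome occurs.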