Safe Learning: bridging the gap between Bayes, MDL and statistical learning theory via empirical convexity
Published on Aug 02, 2011 · 3751 Views
We extend Bayesian MAP and Minimum Description Length (MDL) learning by testing whether the data can be substantially more compressed by a mixture of the MDL/MAP distribution with another element of the model.
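A rough sketch of the compression-test idea described in the abstract may help make it concrete. This is not the paper's exact procedure; the function names, the grid of mixing weights, and the threshold below are illustrative assumptions.

```python
import numpy as np

def codelength(data, log_pdf):
    """Codelength of the data (in nats) under a distribution given by its pointwise log-density."""
    return -np.sum(log_pdf(data))

def compresses_substantially_more(data, mdl_log_pdf, other_log_pdfs, threshold_nats=5.0):
    """Illustrative sketch (not the paper's exact test): check whether some two-component
    mixture of the MDL/MAP distribution with another element of the model assigns the data
    a substantially shorter codelength than the MDL/MAP distribution alone."""
    base = codelength(data, mdl_log_pdf)
    best_gain = 0.0
    for q_log_pdf in other_log_pdfs:
        for alpha in np.linspace(0.05, 0.95, 19):  # mixing weights to try (assumed grid)
            def mix_log_pdf(x, a=alpha, q=q_log_pdf):
                # log of the mixture density (1 - a) * p_mdl(x) + a * q(x)
                return np.logaddexp(np.log(1.0 - a) + mdl_log_pdf(x), np.log(a) + q(x))
            best_gain = max(best_gain, base - codelength(data, mix_log_pdf))
    # "substantially more compressed": the gain exceeds a chosen threshold (assumption)
    return best_gain > threshold_nats, best_gain
```

In use, `mdl_log_pdf` would be the log-density of the MDL/MAP estimate and `other_log_pdfs` the log-densities of other candidate elements of the model, e.g. functions of the form `lambda x: scipy.stats.norm(mu).logpdf(x)` for a Gaussian model.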
Chapter list
Safe Learning (00:00)
Two Seemingly Different Problems - 1 (00:09)
Two Seemingly Different Problems - 2 (03:06)
Basic 2-part code MDL (04:28)
Convergence of 2-MDL - 1 (06:14)
Convergence of 2-MDL - 2 (07:47)
First Insight (08:51)
Bad and Good Misspecification - 1 (09:16)
Bad and Good Misspecification - 2 (09:34)
Bad and Good Misspecification - 3 (09:39)
Can we test (tell from the data) whether we are in the bad situation? (10:38)
YES: we can test whether it’s bad! (11:27)
Can we adjust model or priors to “repair” the situation? - 1 (12:34)
Can we adjust model or priors to “repair” the situation? - 2 (14:12)
YES: we can adjust models/priors to the bad situation! (14:58)
Safe Estimation - 1 (15:53)
Safe Estimation - 2 (16:24)
Safe Estimation - 3 (16:27)
Safe Estimation - 4 (17:08)
Main Result - 1 (17:39)
Main Result - 2 (18:19)
Second Result: What Actually Happens (19:04)
Classification! - 1 (19:44)
Classification! - 2 (20:19)
Classification! - 3 (20:53)
Final Remarks (22:07)