The Sample-Computational Tradeoff
Published on Jan 16, 2013 · 3934 Views
When analyzing the error of a learning algorithm, it is common to decompose it into approximation error (measuring how well the hypothesis class fits the problem) and estimation error (due to having only a finite training sample).
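As a rough sketch of this two-term decomposition, in generic notation not taken from the talk's slides: if \hat{h} is the hypothesis returned by the learner, \mathcal{H} the hypothesis class, and L_D the risk under the data distribution D, then

L_D(\hat{h}) \;=\; \underbrace{\min_{h \in \mathcal{H}} L_D(h)}_{\text{approximation error}} \;+\; \underbrace{\Big( L_D(\hat{h}) - \min_{h \in \mathcal{H}} L_D(h) \Big)}_{\text{estimation error}}

The 3-term decomposition of Bottou & Bousquet '08, discussed later in the talk, further splits off an optimization error term that accounts for solving the learning problem only approximately under limited computation.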
Chapter list
The Sample-Computational Tradeoff (00:00)
Based on joint work with: (00:00)
Agnostic PAC Learning (00:42)
Error Decomposition (02:09)
3-term Error Decomposition (Bottou & Bousquet '08) (03:35)
Joint Time-Sample Complexity - 1 (04:50)
Joint Time-Sample Complexity - 2 (04:59)
Joint Time-Sample Complexity - 3 (05:10)
Outline - 1 (06:47)
Agnostic Learning Preferences - 1 (07:22)
Agnostic Learning Preferences - 2 (11:28)
Sample-Computational Tradeoff - 1 (14:13)
Is this the best we can do? (15:03)
Sample-Computational Tradeoff - 2 (15:50)
HKS: Proof idea (16:40)
Outline - 2 (18:33)
Learning Margin-Based Halfspaces - 1 (19:08)
Learning Margin-Based Halfspaces - 2 (23:25)
Proof Idea (24:35)
Proof Idea (Cont.) (25:41)
Can we do better? (27:57)
Proof ideas (29:39)
Outline - 3 (33:33)
Proof: One-Way Permutations (35:45)
Proof: The Learning Problem - 1 (36:01)
Proof: The Learning Problem - 2 (36:02)
Proof: The Learning Problem - 3 (36:03)
Proof: The Learning Problem - 4 (36:03)
Proof of Second Claim - 1 (36:04)
Proof of Second Claim - 2 (36:06)
Proof of First Claim (36:07)
Proof of Third Claim - 1 (36:08)
Proof of Third Claim - 2 (36:11)
Proof of Third Claim - 3 (36:19)
Summary (36:21)