Mistake bounds and risk bounds for on-line learning algorithms
Published on Feb 25, 2007 · 3134 views
In statistical learning theory, risk bounds are typically obtained by manipulating suprema of empirical processes that measure the largest deviation of the empirical risk from the true risk in a …
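The quantity the abstract alludes to can be sketched as the following uniform deviation; this is a generic illustration, with the hypothesis class $\mathcal{H}$, loss $\ell$, and sample $(X_i, Y_i)_{i=1}^n$ chosen here for exposition rather than taken from the lecture:

```latex
\sup_{f \in \mathcal{H}}
\left|
  \underbrace{\frac{1}{n}\sum_{i=1}^{n} \ell\bigl(f(X_i), Y_i\bigr)}_{\text{empirical risk}}
  \;-\;
  \underbrace{\mathbb{E}\,\ell\bigl(f(X), Y\bigr)}_{\text{true risk}}
\right|
```

Classical risk bounds control this supremum uniformly over $\mathcal{H}$ (e.g. via VC theory), whereas the algorithm-dependent approach discussed in the lecture bounds the risk of the specific functions produced by an on-line learner.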
Chapter list
TAIL RISK BOUNDS (00:03)
STATISTICAL LEARNING THEORY (01:54)
EXAMPLES (05:07)
RISK BOUNDS (06:26)
EXAMPLES (07:03)
DATA-DEPENDENT VC THEORY (07:20)
AN ALGORITHM-DEPENDENT THEORY (13:22)
GOALS (18:34)
AN ALGORITHM-DEPENDENT THEORY (19:30)
GOALS (19:46)
STEP 1: BOUND THE AVERAGE RISK (21:03)
AN ALGORITHM-DEPENDENT THEORY (21:42)
STEP 1: BOUND THE AVERAGE RISK (21:51)
AN ALGORITHM-DEPENDENT THEORY (24:17)
STEP 1: BOUND THE AVERAGE RISK (25:00)
BERNSTEIN’S BOUND (26:25)
APPLICATION OF BERNSTEIN’S BOUND (28:17)
STEP 2: PICK A GOOD FUNCTION IN THE ENSEMBLE (35:09)
APPLICATION OF BERNSTEIN’S BOUND (35:15)
STEP 2: PICK A GOOD FUNCTION IN THE ENSEMBLE (35:35)
STEP 3: RELATE TO OPTIMAL RISK IN H (38:35)
MORE EXAMPLES (43:31)
STEP 3: RELATE TO OPTIMAL RISK IN H (43:34)
MORE EXAMPLES (44:22)
APPLICATION OF BERNSTEIN’S BOUND (45:42)
MORE EXAMPLES (46:07)
STEP 2: PICK A GOOD FUNCTION IN THE ENSEMBLE (47:24)
CONCLUSIONS (47:42)
MORE EXAMPLES (49:28)
EXPERIMENTS ON RCV1 CORPUS (50:13)
AVERAGES OVER ALL CATEGORIES (50:22)
ESTIMATES AFTER 5K DOCUMENTS (50:47)
ESTIMATES AFTER 10K DOCUMENTS (51:04)
ESTIMATES AFTER 20K DOCUMENTS (51:05)
ESTIMATES AFTER 40K DOCUMENTS (51:06)
ESTIMATES AFTER 80K DOCUMENTS (51:07)