
Mistake bounds and risk bounds for on-line learning algorithms
Published on Feb 4, 2025 · 3,138 views
In statistical learning theory, risk bounds are typically obtained by manipulating suprema of empirical processes, which measure the largest deviation of the empirical risk from the true risk in a …
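The approach the abstract alludes to can be sketched as follows. This is the standard uniform-deviation argument with my own notation ($R$ for true risk, $\hat{R}_n$ for empirical risk, $\mathfrak{R}_n$ for Rademacher complexity); the talk itself may set things up differently:

```latex
% For every f in the class F, the excess of true over empirical risk
% is dominated by the supremum of the empirical process:
%   R(f) - \hat{R}_n(f) \le \sup_{g \in F} ( R(g) - \hat{R}_n(g) ).
% Concentration (McDiarmid) plus symmetrization then give,
% with probability at least 1 - \delta,
\sup_{g \in \mathcal{F}} \big( R(g) - \hat{R}_n(g) \big)
  \;\le\; 2\,\mathfrak{R}_n(\mathcal{F}) + \sqrt{\frac{\log(1/\delta)}{2n}} .
```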
Presentation
TAIL RISK BOUNDS (00:03)
STATISTICAL LEARNING THEORY (01:54)
RISK BOUNDS (06:26)
EXAMPLES (07:03)
DATA-DEPENDENT VC THEORY (07:20)
AN ALGORITHM-DEPENDENT THEORY (13:22)
GOALS (18:34)
STEP 1: BOUND THE AVERAGE RISK (21:03)
BERNSTEIN’S BOUND (26:25)
APPLICATION OF BERNSTEIN’S BOUND (28:17)
STEP 2: PICK A GOOD FUNCTION IN THE ENSEMBLE (35:09)
STEP 3: RELATE TO OPTIMAL RISK IN H (38:35)
MORE EXAMPLES (43:31)
CONCLUSIONS (47:42)
EXPERIMENTS ON RCV1 CORPUS (50:13)
AVERAGES OVER ALL CATEGORIES (50:22)
ESTIMATES AFTER 5K DOCUMENTS (50:47)
ESTIMATES AFTER 10K DOCUMENTS (51:04)
ESTIMATES AFTER 20K DOCUMENTS (51:05)
ESTIMATES AFTER 40K DOCUMENTS (51:06)
ESTIMATES AFTER 80K DOCUMENTS (51:07)
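The "Step 1 / Step 2 / Step 3" chapters above outline an online-to-batch conversion: run an on-line learner, collect the ensemble of hypotheses it produces, and turn its mistake bound into a risk bound for a single hypothesis picked from (or built from) that ensemble. A minimal sketch of this pipeline, using the perceptron and weight averaging as a stand-in (the talk's actual algorithm and selection rule may differ):

```python
import random

def perceptron_online(data):
    """Run the perceptron on the sequence once, counting mistakes and
    recording the hypothesis (weight vector) held after each round."""
    d = len(data[0][0])
    w = [0.0] * d
    mistakes = 0
    ensemble = []
    for x, y in data:
        margin = y * sum(wi * xi for wi, xi in zip(w, x))
        if margin <= 0:          # a mistake triggers an update
            mistakes += 1
            w = [wi + y * xi for wi, xi in zip(w, x)]
        ensemble.append(list(w))
    return mistakes, ensemble

def average_hypothesis(ensemble):
    """Pick one function from the ensemble by averaging the weights;
    the mistake bound controls the ensemble's average risk (Step 1),
    which in turn controls the risk of this single predictor (Step 2)."""
    d, n = len(ensemble[0]), len(ensemble)
    return [sum(w[i] for w in ensemble) / n for i in range(d)]

def empirical_risk(w, data):
    """Fraction of examples the hypothesis w misclassifies (0-1 loss)."""
    errs = sum(1 for x, y in data
               if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0)
    return errs / len(data)

# Synthetic linearly separable stream (illustrative, not from the talk).
random.seed(0)
data = []
for _ in range(200):
    x = [random.uniform(-1, 1), random.uniform(-1, 1), 1.0]  # bias feature
    y = 1 if x[0] + 0.5 * x[1] > 0 else -1
    data.append((x, y))

mistakes, ensemble = perceptron_online(data)
w_avg = average_hypothesis(ensemble)
print("mistakes:", mistakes, "averaged-hypothesis risk:",
      empirical_risk(w_avg, data))
```

On separable data the perceptron's mistake count stays small relative to the stream length, which is what makes the average-risk bound of Step 1 non-trivial; Step 3 (comparing against the best function in H) needs the additional concentration argument, e.g. Bernstein's bound from the chapters above.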