PLAL: Cluster-based active learning

Published on Aug 09, 2013 · 3610 Views

We investigate the label complexity of active learning under some smoothness assumptions on the data-generating process. We propose a procedure, PLAL, for "activising" passive, sample-based learners. …

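The description above only names the procedure, so here is a minimal, hypothetical Python sketch of the general cluster-based querying idea that PLAL belongs to (compare the chapter "Convert DH framework to an algorithm: PLAL"): recursively partition the unlabeled sample, query a few labels per cell, and label a whole cell at once when the sampled labels agree. This is not the authors' algorithm; the recursive 2-means splitting, the scikit-learn dependency, the sample_size / min_cluster knobs, and the plal_style_labeling name are illustrative choices of mine, whereas the talk analyzes hierarchical partitions such as dyadic trees.

    import numpy as np
    from sklearn.cluster import KMeans


    def plal_style_labeling(X, query_label, sample_size=5, min_cluster=10, seed=0):
        """Label every row of X while querying only a fraction of the points.

        query_label(i) is the expensive oracle returning the label of point i.
        sample_size and min_cluster are illustrative knobs, not parameters
        taken from the talk.
        """
        rng = np.random.default_rng(seed)
        labels = np.empty(len(X), dtype=object)
        queries = 0

        def process(idx):
            nonlocal queries
            # Query a small random subset of the current cluster.
            probe = rng.choice(idx, size=min(sample_size, len(idx)), replace=False)
            probe_labels = [query_label(i) for i in probe]
            queries += len(probe)
            majority = max(set(probe_labels), key=probe_labels.count)
            if len(set(probe_labels)) == 1 or len(idx) <= min_cluster:
                # Sampled labels agree (or the cluster is tiny):
                # label the whole cluster at once, with no further queries.
                labels[idx] = majority
                return
            # Otherwise the cluster looks impure: refine it and recurse.
            parts = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X[idx])
            if len(set(parts)) < 2:  # degenerate split (e.g. duplicate points)
                labels[idx] = majority
                return
            for p in (0, 1):
                process(idx[parts == p])

        process(np.arange(len(X)))
        return labels, queries


    if __name__ == "__main__":
        # Toy demo: two well-separated Gaussian blobs with labels 0 and 1.
        demo_rng = np.random.default_rng(0)
        X = np.vstack([demo_rng.normal(0, 1, (200, 2)), demo_rng.normal(6, 1, (200, 2))])
        y = np.array([0] * 200 + [1] * 200)
        pred, n_queries = plal_style_labeling(X, lambda i: y[i])
        print(f"queried {n_queries}/{len(X)} labels, accuracy {np.mean(pred == y):.3f}")

On well-clustered data this sketch queries far fewer labels than it outputs, which is the qualitative behaviour the talk quantifies through clusterability and the Probabilistic Lipschitzness assumption.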
Chapter list

PLAL: cluster-based active learning (00:00)
Standard Statistical Learning framework (00:19)
Active Learning (AL) (00:39)
Formal model for Active Learning (01:19)
Challenges of Active Learning (01:35)
Previous work on Active Learning (02:52)
Previous work on AL: Cluster-based - 1 (04:07)
Previous work on AL: Cluster-based - 2 (05:17)
Previous work on AL: Cluster-based - 3 (05:25)
Our Contributions (05:46)
Convert DH framework to an algorithm: PLAL (06:27)
Overview - 1 (07:06)
Error bound (07:50)
Overview - 2 (08:16)
Number of queries depends on clusterability (08:25)
Cluster assumption and Lipschitzness (09:06)
The Probabilistic Lipschitzness assumption (09:36)
PL examples (10:45)
Example – Smoothly clustered data (11:00)
General bound on the number of queries - 1 (11:32)
General bound on the number of queries - 2 (12:18)
Bound for dyadic trees (13:30)
Overview - 3 (14:17)
Using PLAL as a pre-procedure (14:30)
Robustness of Algorithms (15:06)
ERM and RLM algorithms are robust (15:33)
Use PLAL for Statistical Algorithms (16:01)
Use PLAL for Nearest Neighbor learning (16:11)
Can use PLAL with a modified Nearest Neighbor algorithm (16:40)
Overview - 4 (16:42)
Reductions in label complexity (16:53)
Reductions in label complexity of learning (17:20)
Summary (17:55)