Neighbourhood Components Analysis and Metric Learning
Published on Feb 25, 2007 · 13,580 views
Say you want to do K-Nearest Neighbour classification. Besides selecting K, you also have to choose a distance function in order to define "nearest". I'll talk about a method for learning – from the
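As a rough illustration of the approach covered in the talk (the NCA objective of Goldberger et al., 2004: expected leave-one-out accuracy under stochastic neighbour selection, optimised over a linear transform A), here is a minimal NumPy sketch. The function and variable names are my own, and plain gradient ascent stands in for whatever optimiser the talk uses:

```python
import numpy as np

def nca_value_and_grad(A, X, y):
    """Expected leave-one-out accuracy under stochastic neighbour
    selection, and its gradient with respect to the transform A."""
    Z = X @ A.T                                      # project: z_i = A x_i
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                     # a point never picks itself
    P = np.exp(-d2)
    P /= P.sum(axis=1, keepdims=True)                # p_ij: prob. i picks neighbour j
    same = (y[:, None] == y[None, :])
    p = (P * same).sum(axis=1)                       # p_i: prob. of a correct label
    diff = X[:, None, :] - X[None, :, :]             # x_ij = x_i - x_j
    outer = diff[..., :, None] * diff[..., None, :]  # x_ij x_ij^T
    W = p[:, None] * P - P * same                    # weight on each outer product
    grad = 2 * A @ np.einsum('ij,ijab->ab', W, outer)
    return p.sum(), grad

# Toy data: one informative dimension, one pure-noise dimension.
rng = np.random.default_rng(0)
y = np.repeat([0, 1], 30)
X = np.c_[y + 0.1 * rng.standard_normal(60),
          rng.standard_normal(60)]

A = np.eye(2)
for _ in range(50):                                  # plain gradient ascent
    f, g = nca_value_and_grad(A, X, y)
    A += 0.01 * g
```

Because Euclidean distance in the transformed space is a quadratic (Mahalanobis) metric A^T A in the original space, learning A downweights the noise dimension here, which raises the expected leave-one-out accuracy.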
Chapter list
Learning Quadratic Metrics For Classification 00:00
Distance Metric Learning 00:42
Basic Classifiers Perform Annoyingly Well 02:43
Instance/Memory Based Classification 03:25
Problems with Semi-Parametric Classification 05:00
Link with Feature Extraction/Data Transformation 05:43
Cross Validation for Metric Learning? 06:54
Cross-Validation Performance is Hard to Optimize 08:12
Stochastic Neighbour Selection 09:20
Expected Leave-One-Out Error 11:17
Quadratic Metrics - Linear Transforms 13:07
Optimizing Expected Performance 14:55
Neighbourhood Components Analysis 17:54
Scale of Transformation A is also Learned 19:27
Low Rank Metric - Nonsquare A 20:28
Illustration: Concentric Rings 23:32
Face Data 25:27
Related Objective Functions 26:23
Geometric Intuition - Class Collapsing 28:16
Maximally Collapsing Metrics 29:11
MCML is a Convex Optimization Problem 30:54
Relationship to Fisher's Discriminant 32:05
Learning Low-Rank Collapsing Metrics 33:22
Results 34:55
Results 36:27