Minimum Neighbor Distance Estimators of Intrinsic Dimension
Published on Nov 30, 2011 · 2560 views
Most machine learning techniques suffer from the "curse of dimensionality" when applied to high-dimensional data. To face this limitation, a common preprocessing step consists in employing a dimensionality reduction technique.
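The MiND estimators presented in this lecture are not reproduced here; as a minimal sketch of the same family of ideas, namely estimating intrinsic dimension from nearest-neighbor distances, the example below implements the well-known Levina-Bickel maximum likelihood estimator instead. The function name mle_intrinsic_dimension, the neighbor count k=10, and the synthetic dataset are illustrative assumptions, not values taken from the lecture.

```python
# A minimal sketch (NOT the MiND estimators from this talk): the Levina-Bickel
# maximum likelihood estimator of intrinsic dimension, a related approach that
# also relies only on nearest-neighbor distances.
import numpy as np
from sklearn.neighbors import NearestNeighbors


def mle_intrinsic_dimension(X, k=10):
    """Levina-Bickel MLE of intrinsic dimension from k-NN distances."""
    # Distances to the k nearest neighbors of every point.
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dist, _ = nn.kneighbors(X)
    dist = dist[:, 1:]  # drop the zero distance of each point to itself

    # Local estimate per point: inverse of the mean log-ratio between the
    # k-th neighbor distance and the distances to the closer neighbors.
    log_ratio = np.log(dist[:, -1][:, None] / dist[:, :-1])
    d_hat = (k - 1) / np.sum(log_ratio, axis=1)

    # Simple average of the local estimates over the whole sample.
    return float(np.mean(d_hat))


if __name__ == "__main__":
    # Synthetic example: 5-dimensional Gaussian data embedded linearly
    # in a 20-dimensional ambient space (illustrative, not from the talk).
    rng = np.random.default_rng(0)
    Z = rng.standard_normal((2000, 5))
    A = rng.standard_normal((5, 20))
    X = Z @ A
    print(mle_intrinsic_dimension(X, k=10))  # typically close to 5
```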
Chapter list
Minimum Neighbor Distance Estimators of Intrinsic Dimension (00:00)
Outline (00:13)
Motivation - 1 (00:34)
Motivation - 2 (00:39)
Motivation - 3 (00:44)
Motivation - 4 (00:58)
Motivation - 5 (01:10)
Motivation - 6 (01:16)
Motivation - 7 (01:21)
Applications - 1 (01:29)
Applications - 2 (01:36)
Applications - 3 (01:41)
Problems arising with dimensionality - 1 (01:50)
Problems arising with dimensionality - 2 (02:07)
Problems arising with dimensionality - 3 (02:23)
Problems arising with dimensionality - 4 (02:40)
Dimensionality estimation algorithms - 1 (03:18)
Dimensionality estimation algorithms - 2 (03:37)
Dimensionality estimation algorithms - 3 (03:45)
Some state of the art techniques - 1 (04:05)
Some state of the art techniques - 2 (04:11)
Some state of the art techniques - 3 (04:20)
Some state of the art techniques - 4 (04:38)
Some considerations - 1 (04:43)
Some considerations - 2 (04:57)
Some considerations - 3 (05:00)
Some considerations - 4 (05:04)
Some considerations - 5 (05:10)
Some considerations - 6 (05:16)
Some considerations - 7 (05:24)
Our approach - 1 (05:37)
Our approach - 2 (05:50)
Our approach - 3 (05:56)
Our approach - 4 (06:08)
Our approach - 5 (06:20)
Local uniformity - 1 (06:36)
Local uniformity - 2 (06:55)
A log-likelihood function - 1 (07:06)
A log-likelihood function - 2 (07:17)
A log-likelihood function - 3 (07:26)
A log-likelihood function - 4 (07:39)
Log-likelihood - 1 (07:53)
Log-likelihood - 2 (07:57)
Log-likelihood - 3 (08:14)
Log-likelihood - 4 (08:27)
pdf comparison - 1 (08:39)
pdf comparison - 2 (08:47)
pdf comparison - 3 (08:58)
pdf comparison - 4 (09:04)
pdf comparison - 5 (09:12)
MiNDKL - 1 (09:19)
MiNDKL - 2 (09:31)
MiNDKL - 3 (09:34)
MiNDKL - 4 (09:42)
Tests - 1 (09:51)
Tests - 2 (09:55)
Tests - 3 (10:00)
Experimental Setting - 1 (10:13)
Experimental Setting - 2 (10:21)
Results - 1 (10:30)
Results - 2 (10:56)
Conclusions - 1 (11:41)
Conclusions - 2 (11:53)
Conclusions - 3 (11:55)
Conclusions - 4 (12:04)
Future Works - 1 (12:16)
Future Works - 2 (12:21)
Questions (12:32)