Dimensionality Reduction by Feature Selection in Machine Learning

Published on Feb 25, 2007 · 17,254 views

Dimensionality reduction is a commonly used step in machine learning, especially when dealing with a high-dimensional feature space. The original feature space is mapped onto a new space of reduced dimensionality.
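As a rough illustration of the filter approach to feature selection discussed in the lecture, here is a minimal sketch that scores individual binary features by information gain and keeps the top k. The function names and data are my own illustrative choices, not taken from the lecture:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG of a discrete feature X w.r.t. class Y: H(Y) - H(Y | X)."""
    n = len(labels)
    cond = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        cond += len(subset) / n * entropy(subset)
    return entropy(labels) - cond

def select_top_k(X, y, k):
    """Filter-style selection: rank features by IG, keep the k best indices."""
    scores = [information_gain([row[j] for row in X], y)
              for j in range(len(X[0]))]
    ranked = sorted(range(len(scores)), key=lambda j: scores[j], reverse=True)
    return ranked[:k]

# Toy data: features 0 and 1 determine the class, feature 2 is noise.
X = [[1, 0, 1], [1, 0, 0], [0, 1, 1], [0, 1, 0]]
y = [1, 1, 0, 0]
print(select_top_k(X, y, 2))  # the two informative features are ranked first
```

Because the score is computed per feature, independently of any learner, this is a filter in the lecture's taxonomy (as opposed to wrapper or embedded methods).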

Chapter list

Dimensionality Reduction by Feature Selection in Machine Learning (00:01)
Reasons for dimensionality reduction (00:15)
Approaches to dimensionality reduction (00:59)
Example for the problem (03:02)
Search for feature subset (04:05)
Feature subset selection (04:37)
Approaches to feature subset selection (05:18)
Filtering (06:50)
Filters: Distribution-based [Koller & Sahami 1996] (07:41)
Filters: Relief [Kira & Rendell 1992] (08:23)
Filters: FOCUS [Almuallim & Dietterich 1991] (09:32)
Illustration of FOCUS (11:00)
Filters: Random [Liu & Setiono 1996] (11:37)
Filters: MDL-based [Pfahringer 1995] (12:43)
Wrapper (13:30)
Wrappers: Instance-based learning (14:52)
Wrappers: Decision tree induction (15:42)
Metric-based model selection (16:48)
Embedded (18:56)
Embedded (19:15)
Embedded: in filters [Cardie 1993] (20:19)
Simple Filtering (21:03)
Feature subset selection on text data – commonly used methods (21:43)
Scoring individual features (25:08)
Influence of feature selection on the classification performance (28:46)
Illustration of feature selection (29:41)
Illustration on 5 datasets from Yahoo! hierarchy using Naïve Bayes [Mladenic & Grobelnik 2003] (30:22)
CrossEntropy (32:25)
Evaluation measures: rank of the correct category in the list of all categories; F2-measure combining precision and recall with emphasis on recall; Ctgs – number of categories looking promising (testing example needs to be classified) (34:28)
Illustration on Reuters-2000 Data [Brank et al 2002] (39:18)
Experiments with Naïve Bayes Classifier (40:03)
Average number of nonzero components per vector instead of the overall no. of features (41:08)
Experiments with Perceptron Classifier (41:11)
Experiments with the Linear SVM Classifier (41:46)
Discussion: using discarded features can help (42:30)
Discussion (44:19)
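Among the filters listed above, Relief [Kira & Rendell 1992] is simple enough to sketch in a few lines. The version below is a minimal illustration for two-class numeric data, not the lecture's own code: for each sampled instance it finds the nearest hit (same class) and nearest miss (other class), and rewards features that differ on the miss but agree on the hit.

```python
import random

def relief_weights(X, y, n_iter=100, seed=0):
    """Relief-style feature weighting for binary-class numeric data.

    Sample an instance, find its nearest hit and nearest miss by
    squared Euclidean distance, and update each feature's weight by
    (difference to miss) - (difference to hit). Relevant features end
    up with high positive weights; irrelevant ones drift toward zero
    or below.
    """
    rng = random.Random(seed)
    n_feat = len(X[0])
    w = [0.0] * n_feat

    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    for _ in range(n_iter):
        i = rng.randrange(len(X))
        hits = [j for j in range(len(X)) if j != i and y[j] == y[i]]
        misses = [j for j in range(len(X)) if y[j] != y[i]]
        h = min(hits, key=lambda j: dist(X[i], X[j]))
        m = min(misses, key=lambda j: dist(X[i], X[j]))
        for f in range(n_feat):
            w[f] += abs(X[i][f] - X[m][f]) - abs(X[i][f] - X[h][f])
    return w

# Toy data: feature 0 determines the class, feature 1 is noise.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 1, 1]
w = relief_weights(X, y, n_iter=50)
print(w)  # the relevant feature receives the larger weight
```

Like the information-gain filter, Relief scores features without consulting any particular learner, which is what distinguishes the filter chapters above from the wrapper and embedded ones.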