Identifying Feature Relevance using a Random Forest
Published on Feb 25, 2007 · 12,614 views
Many feature selection algorithms are limited in that they attempt to identify relevant feature subsets by examining features individually. This paper introduces a technique for determining feature relevance…
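As a rough illustration of the idea behind the lecture (not the authors' exact algorithm), the sketch below ranks features by the average information gain they contribute across the trees of a random forest, using scikit-learn's impurity-based importances as a stand-in; the synthetic dataset and all parameter values are assumptions chosen for the example.

```python
# Minimal sketch: rank features by average information gain across a random forest.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic data: the first 5 features are informative, the rest irrelevant.
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=5, n_redundant=0,
                           shuffle=False, random_state=0)

# criterion="entropy" makes each split's quality an information gain.
forest = RandomForestClassifier(n_estimators=200, criterion="entropy",
                                random_state=0)
forest.fit(X, y)

# feature_importances_ averages the gain attributed to each feature over all
# trees; higher values suggest more relevant features.
ranking = np.argsort(forest.feature_importances_)[::-1]
for idx in ranking[:5]:
    print(f"feature {idx}: importance {forest.feature_importances_[idx]:.3f}")
```

In practice the informative features should dominate the top of the ranking, while the irrelevant ones receive small but nonzero scores; the lecture's node complexity compensation addresses exactly this kind of residual gain from irrelevant features.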
Chapter list
00:00  Identifying Feature Relevance Using a Random Forest
00:19  Overview
01:17  Random Forest
01:46  Random Forest (cont...)
03:05  Feature Relevance: Ranking
04:19  Feature Relevance: Subset Methods
05:56  Relevance Identification using Average Information Gain
07:18  Node Complexity Compensation
08:01  Unique & Non-Unique Arrangements
08:49  Node Complexity Compensation (cont…)
09:43  Information Gain Density Functions
10:34  Information Gain Density Functions
11:25  Employing Feature Relevance
12:51  Parallel
14:08  Convergence Rates
15:03  Results
16:19  Irrelevant Features
16:56  Expected Information Gain
17:54  Expected Information Gain
18:56  Bounds on Expected Information Gain
19:39  Irrelevant Features: Bounds
20:13  Friedman
21:42  Simple
22:08  Results
23:49  Summary