Sparsity analysis of term weighting schemes and application to text classification

Published on Feb 25, 2007 · 3437 views

We revisit the common practice of feature selection for dimensionality and noise reduction. This typically involves scoring and ranking features based on some weighting scheme and selecting the top-ranked features.
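The abstract describes scoring and ranking terms under a weighting scheme and keeping only the top-ranked ones. As a rough illustration of that practice (not the authors' code), the sketch below ranks the terms of a small bag-of-words count matrix by document frequency, a stand-in for whichever weighting scheme the talk evaluates, and keeps the k highest-scoring terms.

```python
# Minimal sketch of weighting-based feature selection, assuming a
# documents-by-terms count matrix X. Document frequency is used as a
# placeholder scoring scheme.
import numpy as np

def select_top_terms(X: np.ndarray, k: int) -> np.ndarray:
    """Score every term, rank terms by score, and return the indices of the top k."""
    # Document frequency: in how many documents does each term occur?
    scores = (X > 0).sum(axis=0)
    # Rank terms from highest to lowest score and keep the first k.
    ranked = np.argsort(scores)[::-1]
    return ranked[:k]

# Toy usage: 4 documents, 6 terms, keep the 3 highest-scoring terms.
X = np.array([
    [2, 0, 1, 0, 0, 3],
    [0, 1, 1, 0, 0, 2],
    [1, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 2, 1],
])
top = select_top_terms(X, k=3)
X_reduced = X[:, top]          # reduced document-term matrix
print(top, X_reduced.shape)
```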

Chapter list

Sparsity Analysis of Term Weighting Schemes and Application to Text Classification (00:01)
Introduction (00:20)
Feature Weighting Schemes (02:05)
Feature Weighting Schemes (04:51)
Feature Weighting Schemes (05:42)
Characterization of Feature Rankings in terms of Sparsity (08:12)
Sparsity Curves (11:02)
Sparsity as the independent variable (16:09)
Performance as a function of the number of features (Naïve Bayes, 16 categories of RCV2) (18:23)
Performance as a function of sparsity (20:46)
Sparsity as a cutoff criterion (21:56)
Results (24:06)
Conclusions (25:56)
Future work (26:32)
Performance as a function of the number of features (Naïve Bayes, 16 categories of RCV2) (28:55)
Sparsity Curves (30:10)
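The "Sparsity Curves" and "Sparsity as a cutoff criterion" chapters suggest tracking how sparse the document-term matrix remains as more top-ranked features are kept, and choosing the feature cutoff from that curve. The sketch below is one plausible reading of that idea, not the definition used in the talk: sparsity is taken as the fraction of zero entries in the matrix restricted to the k best-ranked terms, and the cutoff keeps as many features as possible while staying under a sparsity budget.

```python
# Assumed illustration of a sparsity curve over a term ranking; the ranking
# again uses document frequency as a placeholder weighting scheme.
import numpy as np

def sparsity_curve(X: np.ndarray) -> np.ndarray:
    """Fraction of zero entries of X restricted to the k best-ranked terms, for every k."""
    ranked = np.argsort((X > 0).sum(axis=0))[::-1]
    return np.array([np.mean(X[:, ranked[:k]] == 0) for k in range(1, X.shape[1] + 1)])

def cutoff_by_sparsity(curve: np.ndarray, max_sparsity: float) -> int:
    """Largest number of top-ranked features whose restricted matrix stays within the sparsity budget."""
    ok = np.nonzero(curve <= max_sparsity)[0]
    return int(ok[-1]) + 1 if ok.size else 0

# Toy usage on the same 4-document, 6-term matrix as above.
X = np.array([
    [2, 0, 1, 0, 0, 3],
    [0, 1, 1, 0, 0, 2],
    [1, 0, 0, 1, 0, 0],
    [0, 0, 1, 0, 2, 1],
])
curve = sparsity_curve(X)
print(curve, cutoff_by_sparsity(curve, max_sparsity=0.5))
```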