Sparsity analysis of term weighting schemes and application to text classification
published: Feb. 25, 2007, recorded: February 2005, views: 137
We revisit the common practice of feature selection for dimensionality and noise reduction. This typically involves scoring and ranking features under some weighting scheme and selecting the top-ranked features for further processing. Experiments show that the performance of text classification methods is sensitive to the characteristics of the feature sets used. For example, the size of the feature set that yields a given performance level for a given classification method can vary widely depending on the feature scoring method. We expand this exploration by considering the representations of individual document vectors that result from a particular feature set. In particular, we observe the average number of features per document vector, i.e., the vector sparsity (or density), and introduce sparsity curves to illustrate how the vector density increases with the feature set size for different weighting schemes. We show that selecting features by specifying a vector density parameter, instead of a feature set size, yields results comparable to the commonly adopted practice. However, it has the added benefit of exposing the effect of feature selection on the document vector representation and on system parameters, such as the memory consumption of the classification operations. Furthermore, the corresponding classification performance curves link the sparsity and performance measures and provide further insight into how feature specificity, i.e., the distribution of a feature across documents in the corpus, is accounted for by the classification method.
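The density-based selection described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a term-document count matrix, ranks features by a given score vector, and grows the feature set until the average number of features per document vector reaches a target density. The function name and the choice of document frequency as the score in the usage example are hypothetical.

```python
import numpy as np

def select_by_density(X, scores, target_density):
    """Select top-ranked features until the average number of selected
    features per document vector (the vector density) reaches
    target_density.  X is an (n_docs, n_features) count matrix;
    scores holds one relevance score per feature (hypothetical setup)."""
    n_docs = X.shape[0]
    order = np.argsort(scores)[::-1]           # best-scoring features first
    col_nnz = (X[:, order] != 0).sum(axis=0)   # docs containing each feature
    density = np.cumsum(col_nnz) / n_docs      # avg features/doc for top-k
    k = int(np.searchsorted(density, target_density)) + 1
    k = min(k, len(order))
    return order[:k], density[k - 1]

# Usage sketch: 4 documents, 3 features, scored by document frequency.
X = np.array([[1, 0, 1],
              [1, 1, 0],
              [1, 0, 0],
              [1, 0, 1]])
df_scores = (X != 0).sum(axis=0)               # document frequency as score
selected, reached = select_by_density(X, df_scores, target_density=1.5)
```

With these toy values the selector picks the two most frequent terms (features 0 and 2), giving an average of 1.5 features per document vector; the same sweep over increasing k traces the sparsity curve discussed in the abstract.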