Linear Discriminant Dimensionality Reduction

produced by: Data & Web Mining Lab
author: Jiawei Han, Department of Computer Science, University of Illinois at Urbana-Champaign
published: Nov. 30, 2011, recorded: September 2011, views: 3781


Download slides: ecmlpkdd2011_han_dimensionality_01.pdf (547.0 KB)





The Fisher criterion has achieved great success in dimensionality reduction. Two representative methods based on the Fisher criterion are Fisher score and Linear Discriminant Analysis (LDA). The former was developed for feature selection, while the latter is designed for subspace learning. In the past decade, these two approaches have usually been studied independently. In this paper, based on the observation that Fisher score and LDA are complementary, we propose to integrate them in a unified framework, namely Linear Discriminant Dimensionality Reduction (LDDR). We aim to find a subset of features such that the linear transformation learnt via LDA on those features maximizes the Fisher criterion. LDDR inherits the advantages of Fisher score and LDA and is able to perform feature selection and subspace learning simultaneously. Both Fisher score and LDA can be seen as special cases of the proposed method. The resulting optimization problem is a mixed integer program, which is difficult to solve. We relax it into an L2,1-norm constrained least squares problem and solve it with an accelerated proximal gradient algorithm. Experiments on benchmark face recognition data sets illustrate that the proposed method arguably outperforms state-of-the-art methods.
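To make the abstract's ingredients concrete, here is a minimal NumPy sketch of two of the building blocks it names, assuming their standard textbook definitions: the Fisher score of each feature (between-class scatter divided by within-class scatter, computed per feature) and the L2,1 norm used in the relaxation (the sum of the row-wise L2 norms of a transformation matrix, which encourages whole rows, i.e. features, to vanish). This illustrates the components only; it is not the joint LDDR optimization or the accelerated proximal gradient solver from the paper.

```python
import numpy as np

def fisher_score(X, y):
    """Per-feature Fisher score: between-class variance / within-class variance.

    X: (n_samples, n_features) data matrix, y: (n_samples,) class labels.
    """
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between = np.zeros(X.shape[1])
    within = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        nc = Xc.shape[0]
        between += nc * (Xc.mean(axis=0) - overall_mean) ** 2
        within += nc * Xc.var(axis=0)
    return between / within

def l21_norm(W):
    """L2,1 norm: sum of the L2 norms of the rows of W."""
    return np.sqrt((W ** 2).sum(axis=1)).sum()

# Toy data: feature 0 separates the two classes, feature 1 is pure noise,
# so feature 0 should receive a much higher Fisher score.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1, (50, 2)),
               rng.normal([5, 0], 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
scores = fisher_score(X, y)
```

In the sequential baseline the paper improves on, one would rank features by `scores`, keep the top-k, and then run LDA on the reduced matrix; LDDR instead optimizes the selection and the LDA transformation jointly.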
