A Hilbert-Schmidt Dependence Maximization Approach to Unsupervised Structure Discovery

author: Arthur Gretton, Centre for Computational Statistics and Machine Learning, University College London
published: Aug. 25, 2008, recorded: July 2008, views: 4870



In recent work, Song et al. (2007) proposed performing clustering by maximizing a Hilbert-Schmidt independence criterion with respect to a predefined cluster structure Y, solving for the partition matrix. We extend this approach to the case where the cluster structure Y is not fixed but is itself a quantity to be optimized, and we use an independence criterion that has been shown to be more sensitive at small sample sizes: the Hilbert-Schmidt Normalized Information Criterion, or HSNIC (Fukumizu et al., 2008). We demonstrate the use of this framework in two scenarios. In the first, we adopt a cluster structure selection approach in which the HSNIC is used to select a structure from several candidates. In the second, we consider the case where we discover structure by directly optimizing Y.
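The cluster-structure selection scenario can be illustrated with a minimal sketch. Note this is an assumption-laden simplification: it uses the plain (biased) HSIC estimator, trace(KHLH)/(n-1)^2, rather than the normalized HSNIC of the talk, and all function names (`rbf_kernel`, `hsic`, `label_kernel`) are illustrative, not from the authors' code.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian RBF kernel matrix on the data
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def label_kernel(y):
    # Linear kernel on one-hot cluster assignments:
    # L[i, j] = 1 iff points i and j share a cluster
    y = np.asarray(y)
    return (y[:, None] == y[None, :]).astype(float)

def hsic(K, L):
    # Biased empirical HSIC estimate: trace(K H L H) / (n-1)^2,
    # with H the centering matrix I - (1/n) 11^T
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def select_structure(K, candidate_labelings):
    # Pick the candidate cluster structure Y that maximizes dependence
    # between the data kernel K and the label kernel of Y
    scores = [hsic(K, label_kernel(y)) for y in candidate_labelings]
    return int(np.argmax(scores)), scores
```

Given well-separated data, a labeling that respects the true grouping yields a larger dependence score than a random labeling, so `select_structure` would pick it from the candidate set.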


Download slides: mlg08_gretton_ahsdma_01.pdf (2.1 MB)



Reviews and comments:

Lei, September 24, 2008 at 10:53 a.m.:

Nice presentation. Except I didn't get the 4-point constraint. Looking forward to seeing the paper or the references.
