A Hilbert-Schmidt Dependence Maximization Approach to Unsupervised Structure Discovery
published: Aug. 25, 2008, recorded: July 2008, views: 446
In recent work, Song et al. (2007) proposed performing clustering by maximizing a Hilbert-Schmidt independence criterion with respect to a predefined cluster structure Y, solving for the partition matrix. We extend this approach to the case where the cluster structure Y is not fixed but is itself a quantity to be optimized, and we use an independence criterion that has been shown to be more sensitive at small sample sizes: the Hilbert-Schmidt Normalized Information Criterion, or HSNIC (Fukumizu et al., 2008). We demonstrate the use of this framework in two scenarios. In the first, we adopt a cluster structure selection approach in which the HSNIC is used to select a structure from several candidates. In the second, we consider the case where we discover structure by directly optimizing Y.
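To make the dependence-maximization idea concrete, here is a minimal sketch of the standard biased empirical HSIC estimate, HSIC = (1/n²) tr(K H L H), used to compare candidate partition matrices Y via the label kernel L = Y Yᵀ. This illustrates the general approach only; the kernel choices, the toy data, and the helper names (`hsic`, `gaussian_kernel`, `label_kernel`) are assumptions for illustration, not the authors' implementation, and the normalized HSNIC variant from the talk is not reproduced here.

```python
import numpy as np

def hsic(K, L):
    # Biased empirical HSIC estimate: (1/n^2) * trace(K H L H),
    # where H = I - (1/n) 1 1^T is the centering matrix.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n**2

def gaussian_kernel(X, sigma=1.0):
    # Gram matrix of the Gaussian RBF kernel on the rows of X.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

def label_kernel(y):
    # Delta kernel on cluster labels, equivalent to L = Y Y^T
    # for a one-hot partition matrix Y.
    return (y[:, None] == y[None, :]).astype(float)

# Toy data: two well-separated clusters in 2-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (10, 2)),
               rng.normal(3, 0.1, (10, 2))])
K = gaussian_kernel(X)

# Two candidate cluster structures: one matching the true
# grouping, one with scrambled assignments.
good = np.repeat([0, 1], 10)
bad = np.tile([0, 1], 10)

# The candidate that matches the data's structure yields the
# larger dependence value, which is the selection principle
# behind the structure-selection scenario above.
print(hsic(K, label_kernel(good)) > hsic(K, label_kernel(bad)))
```

In the structure-selection scenario this score would be computed for each candidate Y (with a normalized criterion in place of raw HSIC); in the second scenario, Y itself is optimized against the same objective.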
Download slides: mlg08_gretton_ahsdma_01.pdf (2.1 MB)