Confluence between Kernel Methods and Graphical Models

7 Lectures · Dec 8, 2012

About

Kernel methods and graphical models are two important families of techniques for machine learning. Our community has witnessed many major but separate advances in the theory and applications of both subfields. For kernel methods, the advances include kernels on structured data, Hilbert-space embeddings of distributions, and applications of kernel methods to multiple kernel learning, transfer learning, and multi-task learning. For graphical models, the advances include variational inference, nonparametric Bayes techniques, and applications of graphical models to topic modeling, computational biology and social network problems.
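
To make one of the techniques named above concrete, the short sketch below (an illustrative assumption, not taken from the workshop materials; the names rbf_kernel and mmd_squared and the bandwidth value are hypothetical choices) uses the Hilbert-space embedding of a distribution implicitly, via the maximum mean discrepancy (MMD): the RKHS distance between the empirical kernel mean embeddings of two samples under a Gaussian RBF kernel.

import numpy as np

def rbf_kernel(X, Y, bandwidth=1.0):
    # Gram matrix of the Gaussian RBF kernel k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)).
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def mmd_squared(X, Y, bandwidth=1.0):
    # Biased estimate of ||mu_P - mu_Q||^2, the squared RKHS distance between
    # the empirical kernel mean embeddings of the two samples.
    Kxx = rbf_kernel(X, X, bandwidth)
    Kyy = rbf_kernel(Y, Y, bandwidth)
    Kxy = rbf_kernel(X, Y, bandwidth)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))  # sample from distribution P
Y = rng.normal(0.5, 1.0, size=(200, 2))  # sample from distribution Q
print(mmd_squared(X, Y))  # larger values indicate more dissimilar distributions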

This workshop addresses two main research questions: first, how may kernel methods be used to address difficult learning problems for graphical models, such as inference for multi-modal continuous distributions on many variables, and dealing with non-conjugate priors? And second, how might kernel methods be advanced by bringing in concepts from graphical models, for instance by incorporating sophisticated conditional independence structures, latent variables, and prior information?

Kernel algorithms have traditionally had the advantage of being solved via convex optimization or eigenproblems, and having strong statistical guarantees on convergence. The graphical model literature has focused on modelling complex dependence structures in a flexible way, although approximations may be required to make inference tractable. Can we develop a new set of methods which blend these strengths?
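
As a purely illustrative instance of the "convex optimization" point, kernel ridge regression fits a nonparametric regressor with a single regularised linear solve on the Gram matrix. The sketch below is an assumption of ours, not drawn from the workshop; fit_kernel_ridge and the RBF bandwidth are hypothetical choices.

import numpy as np

def fit_kernel_ridge(K, y, lam=1e-2):
    # Dual coefficients alpha solving (K + lam * n * I) alpha = y,
    # the closed-form minimiser of the convex regularised squared loss.
    n = K.shape[0]
    return np.linalg.solve(K + lam * n * np.eye(n), y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
K = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))  # Gaussian RBF Gram matrix
alpha = fit_kernel_ridge(K, y)
print(np.mean((K @ alpha - y) ** 2))  # in-sample squared error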

A number of publications have recently combined kernel and graphical model techniques, including kernel hidden Markov models, kernel belief propagation, the kernel Bayes' rule, kernel topic models, kernel variational inference, kernel herding as Bayesian quadrature, kernel beta processes, and a connection between kernel k-means and Bayesian nonparametrics. Each of these results addresses a different inference task and makes use of a range of RKHS properties. We propose this workshop to "connect the dots" and develop a unified toolkit to address a broad range of learning problems, to the mutual benefit of researchers in kernels and graphical models. The goals of the workshop are thus twofold: first, to provide an accessible review and synthesis of recent results combining graphical models and kernels; and second, to provide a discussion forum for open problems and technical challenges.

Workshop homepage: https://sites.google.com/site/kernelgraphical/

Uploaded videos:

Invited Talks

The Kernel Beta Process
Lawrence Carin · 25:03 · Jan 16, 2013 · 3111 Views · Invited Talk

Kernel Topic Models
Thore Graepel · 27:58 · Jan 16, 2013 · 3576 Views · Invited Talk

Nonparametric Variational Inference
Matt Hoffman · 29:12 · Jan 16, 2013 · 4671 Views · Invited Talk

Determinantal Point Processes
Ben Taskar · 32:33 · Jan 23, 2013 · 6790 Views · Invited Talk

Bayesian Interpretations of RKHS Embedding Methods
David Kristjanson Duvenaud · 22:22 · Jan 16, 2013 · 4566 Views · Invited Talk

Contributed Talks

Hilbert Space Embedding for Dirichlet Process Mixtures
Krikamol Muandet · 14:09 · Jan 18, 2013 · 3831 Views · Lecture

Kernels for Protein Structure Prediction
Narges Sharif-Razavian · 11:52 · Jan 18, 2013 · 2997 Views · Lecture