Confluence between Kernel Methods and Graphical Models
Kernel methods and graphical models are two important families of techniques for machine learning. Our community has witnessed many major but separate advances in the theory and applications of both subfields. For kernel methods, the advances include kernels on structured data, Hilbert-space embeddings of distributions, and applications of kernel methods to multiple kernel learning, transfer learning, and multi-task learning. For graphical models, the advances include variational inference, nonparametric Bayes techniques, and applications of graphical models to topic modeling, computational biology and social network problems.
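To make the Hilbert-space embedding idea above concrete, here is a small illustrative sketch (not from the workshop itself; all names and parameters are our own): a distribution is represented by the empirical mean of kernel features of its samples, and two distributions can be compared via the maximum mean discrepancy (MMD) between their embeddings, computed here with a Gaussian RBF kernel.

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    """Gaussian RBF kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Biased estimate of the squared MMD between the mean embeddings of X and Y."""
    return (rbf_kernel(X, X, sigma).mean()
            - 2 * rbf_kernel(X, Y, sigma).mean()
            + rbf_kernel(Y, Y, sigma).mean())

rng = np.random.default_rng(0)
same = mmd2(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
diff = mmd2(rng.normal(size=(200, 2)), rng.normal(loc=2.0, size=(200, 2)))
# Samples from a shifted distribution are farther apart in the RKHS.
```

The key point is that no density estimation or parametric assumption is needed: all comparisons between distributions reduce to kernel evaluations on samples, which is what makes these embeddings attractive for inference in graphical models.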
This workshop addresses two main research questions: first, how may kernel methods be used to address difficult learning problems for graphical models, such as inference for multi-modal continuous distributions on many variables, and dealing with non-conjugate priors? And second, how might kernel methods be advanced by bringing in concepts from graphical models, for instance by incorporating sophisticated conditional independence structures, latent variables, and prior information?
Kernel algorithms have traditionally had the advantage of being solvable via convex optimization or eigenproblems, and of having strong statistical guarantees on convergence. The graphical model literature has focused on modelling complex dependence structures in a flexible way, although approximations may be required to make inference tractable. Can we develop a new set of methods that blend these strengths?
There have recently been a number of publications combining kernel and graphical model techniques, including kernel hidden Markov models, kernel belief propagation, the kernel Bayes' rule, kernel topic models, kernel variational inference, kernel herding as Bayesian quadrature, kernel beta processes, and a connection between kernel k-means and Bayesian nonparametrics. Each of these results deals with a different inference task and makes use of a range of RKHS properties. We propose this workshop so as to "connect the dots" and develop a unified toolkit to address a broad range of learning problems, to the mutual benefit of researchers in kernels and graphical models. The goals of the workshop are thus twofold: first, to provide an accessible review and synthesis of recent results combining graphical models and kernels; second, to provide a discussion forum for open problems and technical challenges.
Workshop homepage: https://sites.google.com/site/kernelgraphical/