About
Deterministic (variational) techniques are used throughout Machine Learning to approximate Bayesian inference for continuous- and hybrid-variable problems. In contrast to discrete-variable approximations, surprisingly little is known about their convergence, approximation quality, numerical stability, specific biases, and the differential strengths and weaknesses of known methods.
In this workshop, we aim to highlight important problems and to gather ideas for how to address them. The target audience includes practitioners, who can provide insight into and analysis of problems with particular methods or comparative studies of several methods, as well as theoreticians interested in characterizing the hardness of continuous distributions or in proving relevant properties of established methods. We especially welcome contributions from Statistics (Markov Chain Monte Carlo), Information Geometry, Optimal Filtering, and other related fields, provided they make an effort to bridge the gap towards variational techniques.
Videos

Approximation and Inference using Latent Variable Sparse Linear Models · Feb 1, 2008 · 4428 views

Perturbative Corrections to Expectation Consistent Approximate Inference · Dec 31, 2007 · 3717 views

A Completed Information Projection Interpretation of Expectation Propagation · Dec 31, 2007 · 4432 views

Improving on Expectation Propagation · Dec 31, 2007 · 4241 views

Introduction to the Workshop · Dec 31, 2007 · 3598 views

Bounds on the Bethe Free Energy for Gaussian Networks · Feb 1, 2008 · 4297 views

Infer.NET - Practical Implementation Issues and a Comparison of Approximation Te... · Dec 31, 2007 · 10051 views

Approximating the Partition Function by Deleting and then Correcting for Model E... · Dec 31, 2007 · 3262 views

Variational Optimisation by Marginal Matching · Dec 31, 2007 · 3777 views

Message-Passing Algorithms for GMRFs and Non-Linear Optimization · Feb 1, 2008 · 4060 views

Large-scale Bayesian Inference for Collaborative Filtering · Dec 31, 2007 · 9688 views