About
Deterministic (variational) techniques are used throughout Machine Learning to approximate Bayesian inference for continuous- and hybrid-variable problems. In contrast to discrete-variable approximations, surprisingly little is known about their convergence, quality of approximation, numerical stability, specific biases, and the differential strengths and weaknesses of known methods.
In this workshop, we aim to highlight important problems and to gather ideas for how to address them. The target audience is practitioners, providing insight into and analysis of problems with certain methods or comparative studies of several methods, as well as theoreticians interested in characterizing the hardness of continuous distributions or proving relevant properties of an established method. We especially welcome contributions from Statistics (Markov Chain Monte Carlo), Information Geometry, Optimal Filtering, and other related fields if they make an effort to bridge the gap towards variational techniques.
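To make the notion of approximation quality and bias concrete, the following is a minimal sketch (not from any of the talks below; the target density and all parameter values are illustrative assumptions) of the basic variational setup: fitting a Gaussian q to a skewed one-dimensional target p by minimizing the KL divergence KL(q || p) on a grid.

```python
import numpy as np

# Hypothetical skewed target: Gaussian times a sigmoid (skew-normal-like),
# evaluated on a grid and normalized numerically.
x = np.linspace(-6.0, 10.0, 2001)
dx = x[1] - x[0]
log_p = -0.5 * x**2 + 0.6 * x - np.log1p(np.exp(-2.0 * x))
p = np.exp(log_p - log_p.max())
p /= p.sum() * dx

def kl_q_to_p(mu, sigma):
    """Discretized KL(q || p) for a Gaussian q with mean mu, std sigma."""
    q = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    mask = q > 1e-12  # ignore negligible tail mass
    return np.sum(q[mask] * (np.log(q[mask]) - np.log(p[mask])) * dx)

# Coarse grid search over the variational parameters (mu, sigma);
# a real implementation would use gradient-based optimization instead.
mus = np.linspace(-1.0, 3.0, 81)
sigmas = np.linspace(0.3, 3.0, 55)
best = min((kl_q_to_p(m, s), m, s) for m in mus for s in sigmas)
print("KL=%.4f at mu=%.2f sigma=%.2f" % best)
```

Because the target is skewed and q is symmetric, the best KL is strictly positive; this residual divergence is exactly the kind of method-specific bias the workshop asks about.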
Uploaded videos:
Introduction to the Workshop · Dec 31, 2007 · 3594 Views
Infer.NET - Practical Implementation Issues and a Comparison of Approximation Te... · Dec 31, 2007 · 10035 Views
Approximating the Partition Function by Deleting and then Correcting for Model E... · Dec 31, 2007 · 3257 Views
Variational Optimisation by Marginal Matching · Dec 31, 2007 · 3774 Views
Improving on Expectation Propagation · Dec 31, 2007 · 4237 Views
Large-scale Bayesian Inference for Collaborative Filtering · Dec 31, 2007 · 9679 Views
Perturbative Corrections to Expectation Consistent Approximate Inference · Dec 31, 2007 · 3712 Views
A Completed Information Projection Interpretation of Expectation Propagation · Dec 31, 2007 · 4428 Views
Approximation and Inference using Latent Variable Sparse Linear Models · Feb 01, 2008 · 4422 Views
Message-Passing Algorithms for GMRFs and Non-Linear Optimization · Feb 01, 2008 · 4056 Views
Bounds on the Bethe Free Energy for Gaussian Networks · Feb 01, 2008 · 4294 Views