Perturbative Corrections to Expectation Consistent Approximate Inference
published: Dec. 31, 2007, recorded: December 2007, views: 144
Algorithms for approximate inference usually come without any guarantee on the quality of the approximation. Nevertheless, we often find cases where such algorithms compute posterior moments extremely well when compared with time-consuming (and, in the limit, exact) Monte Carlo simulations or exact enumeration.
A prominent example is the Expectation Propagation (EP) algorithm when applied to Gaussian process classification. Can we understand when and why we can trust the approximate results or, if not, how we could obtain systematic improvements?
In this talk, we rederive the fixed-point conditions of EP using the ideas of expectation consistency (EC) and explicitly consider the terms neglected in the approximation. We show how one can derive a formal (asymptotic) power-series expansion for this correction and compute its leading terms. We illustrate the approach for Gaussian process classification and for networks of Ising variables.
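The fixed-point conditions of EP amount to matching the moments of a tractable Gaussian approximation against those of a "tilted" distribution that includes one exact likelihood term. As a minimal sketch of that core step for GP classification with a probit likelihood Phi(y*x) (an assumption for illustration; the talk itself concerns the corrections beyond this step), the closed-form tilted moments follow from standard Gaussian identities and can be checked against brute-force quadrature:

```python
import numpy as np
from math import erf, exp, pi, sqrt

def tilted_moments(mu, s2, y):
    """Closed-form mean and variance of the tilted distribution
    q(x) ∝ N(x | mu, s2) * Phi(y*x), where Phi is the standard
    normal CDF (probit likelihood, label y = ±1)."""
    z = y * mu / sqrt(1.0 + s2)
    # ratio pdf(z) / cdf(z) of the standard normal
    r = (exp(-0.5 * z * z) / sqrt(2.0 * pi)) / (0.5 * (1.0 + erf(z / sqrt(2.0))))
    mean = mu + y * s2 * r / sqrt(1.0 + s2)
    var = s2 - (s2 ** 2) * r / (1.0 + s2) * (z + r)
    return mean, var

def tilted_moments_numeric(mu, s2, y, n=200_001, width=12.0):
    """Brute-force check: the same moments by quadrature on a fine grid."""
    x = np.linspace(mu - width * sqrt(s2), mu + width * sqrt(s2), n)
    cdf = 0.5 * (1.0 + np.vectorize(erf)(y * x / sqrt(2.0)))
    w = np.exp(-0.5 * (x - mu) ** 2 / s2) * cdf   # unnormalized density
    dx = x[1] - x[0]
    w /= w.sum() * dx                              # normalize
    mean = (w * x).sum() * dx
    var = (w * (x - mean) ** 2).sum() * dx
    return mean, var

m_a, v_a = tilted_moments(0.3, 1.5, 1)
m_n, v_n = tilted_moments_numeric(0.3, 1.5, 1)
```

In a full EP sweep this moment match is performed site by site; the perturbative corrections discussed in the talk quantify what such repeated Gaussian matching leaves out.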
Reference: Manfred Opper and Ole Winther, "Expectation Consistent Approximate Inference", JMLR 6:2177-2204, 2005.