Inference in Graphical Models
published: March 12, 2008, recorded: March 2008, views: 15037
This short course covers the basics of inference in graphical models. It starts by explaining the theory of probabilistic graphical models, including the concepts of conditional independence and factorisation and how they arise in both Markov random fields and Bayesian networks. The lecturer then presents the fundamental methods for performing exact probabilistic inference in such models, including algorithms such as variable elimination, belief propagation, and junction trees. He also briefly discusses some current methods for performing approximate inference when exact inference is infeasible. Finally, he illustrates a range of real problems whose solutions can be formulated as inference in graphical models.
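To give a concrete flavour of variable elimination, one of the exact-inference algorithms the course covers, here is a minimal sketch on an assumed toy example (a three-node chain Bayesian network with made-up probabilities, not taken from the lecture):

```python
# Variable elimination on a toy chain Bayesian network A -> B -> C
# (binary variables; all names and numbers are illustrative only).
p_a = [0.6, 0.4]                          # P(A)
p_b_given_a = [[0.7, 0.3], [0.2, 0.8]]    # P(B | A), rows indexed by A
p_c_given_b = [[0.9, 0.1], [0.5, 0.5]]    # P(C | B), rows indexed by B

# Eliminate A: intermediate factor phi(b) = sum_a P(a) P(b | a)
phi_b = [sum(p_a[a] * p_b_given_a[a][b] for a in range(2)) for b in range(2)]

# Eliminate B: marginal P(c) = sum_b phi(b) P(c | b)
p_c = [sum(phi_b[b] * p_c_given_b[b][c] for b in range(2)) for c in range(2)]

print(p_c)  # a valid distribution over C, approximately [0.7, 0.3]
```

The point of the ordering is cost: summing out one variable at a time keeps every intermediate factor small, instead of building the full joint table over all variables first.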
Download slides: mlss08au_caetano_grmo.pdf (3.2 MB)
Reviews and comments:
It's certainly a good introduction to the basics of graphical models: it spends enough time to convey the basic notions and is great, especially for someone with little background in the topic. The downside might be that it's still a little "slow", even for a novice in this field.