Dirichlet Processes and Nonparametric Bayesian Modelling
published: Feb. 25, 2007, recorded: February 2006, views: 5247
Bayesian modeling is a principled approach to updating the degree of belief in a hypothesis given prior knowledge and available evidence. Prior knowledge and evidence are combined using Bayes' rule to obtain the posterior distribution over hypotheses. In most cases of interest to machine learning, the prior knowledge is formulated as a prior distribution over parameters and the evidence corresponds to the observed data. By applying Bayes' formula we can perform inference about new data. As sufficient data are observed, the posterior parameter distribution becomes increasingly concentrated and the influence of the prior distribution diminishes. Under some assumptions (in particular, that the likelihood model is correct and that the true parameters have positive prior probability), the posterior distribution converges to a point distribution located at the true parameters. The challenges in Bayesian modeling are, first, to find suitable application-specific statistical models and, second, to (approximately) solve the resulting inference equations.
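The updating and concentration behaviour described above can be sketched with a conjugate Beta-Bernoulli model (an illustrative example chosen here, not drawn from the lecture itself): a Beta prior over a coin's bias is combined with observed coin flips via Bayes' rule, and the posterior variance shrinks as more data arrive.

```python
# Sketch of Bayesian updating with a conjugate Beta-Bernoulli pair.
# The prior Beta(alpha, beta) and the "true" bias 0.7 are assumptions
# made for this illustration.

def beta_posterior(alpha, beta, heads, tails):
    """Bayes' rule for the conjugate pair: the Beta posterior after
    observing `heads` successes and `tails` failures."""
    return alpha + heads, beta + tails

def beta_mean_var(alpha, beta):
    """Mean and variance of a Beta(alpha, beta) distribution."""
    mean = alpha / (alpha + beta)
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    return mean, var

# Prior: Beta(2, 2), centred at 0.5; true coin bias: 0.7.
prior_alpha, prior_beta = 2.0, 2.0

# As the sample size grows (with 70% heads), the posterior mean
# approaches the true bias and the posterior variance shrinks,
# i.e. the prior's influence diminishes.
for n in (10, 100, 10000):
    heads = int(0.7 * n)
    a, b = beta_posterior(prior_alpha, prior_beta, heads, n - heads)
    mean, var = beta_mean_var(a, b)
    print(f"n={n:6d}  posterior mean={mean:.3f}  posterior variance={var:.2e}")
```

With a misspecified likelihood or a prior that assigns zero probability to the true parameter, this convergence guarantee no longer holds, which is exactly the caveat noted above.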