Probabilistic Decision-Making Under Model Uncertainty
published: Jan. 15, 2009, recorded: October 2008, views: 2287
Partially observable Markov decision processes (POMDPs) offer a rich mathematical framework for decision-making under uncertainty. In recent years, a number of methods have been developed to optimize the choice of action given a parametric model of the domain. In many applications, however, this model must be learned from a finite set of trajectories. When such data is difficult or expensive to collect, the resulting model is often poorly or imprecisely defined.
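The gap between a learned model and the true domain can be made concrete with a small sketch. The following code is illustrative only, not from the talk: it assumes a hypothetical two-state, two-action domain with a known ground-truth transition model, estimates that model by maximum likelihood from a finite sample of logged transitions, and shows how the estimate tightens as data accumulates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state, 2-action domain; true_T[a][s, s'] is the true
# transition model, which the agent must estimate from finite data.
true_T = np.array([
    [[0.9, 0.1], [0.2, 0.8]],   # action 0
    [[0.5, 0.5], [0.6, 0.4]],   # action 1
])

def estimate_model(n_transitions):
    """Maximum-likelihood transition estimate from a finite sample."""
    counts = np.ones_like(true_T)  # add-one smoothing keeps rows valid
    for _ in range(n_transitions):
        s = rng.integers(2)
        a = rng.integers(2)
        s_next = rng.choice(2, p=true_T[a, s])
        counts[a, s, s_next] += 1
    return counts / counts.sum(axis=2, keepdims=True)

for n in (20, 2000):
    T_hat = estimate_model(n)
    err = np.abs(T_hat - true_T).max()
    print(f"n={n:5d}  max |T_hat - T| = {err:.3f}")
```

With only 20 transitions the estimated model can deviate substantially from the truth, which is exactly the regime the talk addresses: any policy optimized against such an estimate inherits its error.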
In this talk, I will present two recent results on the topic of decision-making under model uncertainty. In the first half, I will describe a method for estimating the bias and variance of the value function in terms of the statistics of the empirical transition and observation model. Such error terms can be used to meaningfully compare the value of different policies. In the second half, I will present a Bayesian approach designed to simultaneously improve the model and select good actions. The performance of both methods will be illustrated on problems drawn from robotics and medical treatment design.
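The idea of attaching error terms to a value estimate can be sketched in a simplified fully observable setting (the talk treats the harder POMDP case). This illustrative code, with made-up counts and rewards, places a Dirichlet posterior over each row of an empirical transition matrix, samples models from it, and reports the mean and spread of a fixed policy's value; the spread is the kind of uncertainty measure that lets two policies be compared meaningfully.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-state MDP under a fixed policy: observed transition
# counts per state, known rewards, discount factor gamma.
counts = np.array([[8.0, 2.0],    # from state 0: 8 self-loops, 2 exits
                   [3.0, 7.0]])   # from state 1
R = np.array([1.0, 0.0])          # reward received in each state
gamma = 0.9

def policy_value(T):
    """Exact discounted value of the fixed policy under transitions T."""
    v = np.linalg.solve(np.eye(2) - gamma * T, R)
    return v[0]  # value starting from state 0

# Each row of T gets a Dirichlet posterior with the counts as parameters;
# sampling whole models induces a distribution over the policy's value.
values = np.array([
    policy_value(np.vstack([rng.dirichlet(row) for row in counts]))
    for _ in range(5000)
])
print(f"value estimate: {values.mean():.2f} +/- {values.std():.2f}")
```

Two candidate policies can then be compared by their full value distributions rather than by point estimates alone, so that a policy whose apparent advantage is smaller than the model-induced spread is not preferred on noise.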