On the relation between Bayesian inference and certain solvable problems of stochastic control
published: Oct. 9, 2008, recorded: September 2008, views: 4636
Optimal control for nonlinear stochastic dynamical systems requires the solution of a nonlinear PDE, the so-called Hamilton-Jacobi-Bellman equation. Recently, Bert Kappen and Emanuel Todorov have shown that for certain types of cost functions, this equation can be transformed into a linear problem that is mathematically related to a Bayesian estimation problem. This has led to novel, efficient algorithms for optimal control of such systems. I will show a simple proof of this surprising result and discuss some possible implications.
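To make the linearization concrete, here is a minimal sketch of the discrete-state version of this idea (Todorov's linearly-solvable MDPs). The toy setup, variable names, and dimensions are my own illustration, not the lecture's material: with state cost q(x), passive dynamics P, and the exponentiated value function ("desirability") z(x) = exp(-V(x)), the Bellman equation becomes a linear backward recursion, formally identical to the backward message pass of a hidden Markov model with "emission likelihood" exp(-q).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy problem: random passive dynamics and state costs.
n, T = 5, 20                       # number of states, horizon
P = rng.random((n, n))
P /= P.sum(axis=1, keepdims=True)  # row-stochastic passive (uncontrolled) dynamics
q = rng.random(n)                  # state cost q(x) >= 0

# Desirability z = exp(-V) satisfies the LINEAR recursion
#   z_t = exp(-q) * (P @ z_{t+1}),
# i.e. an HMM backward message pass with emissions exp(-q).
z = np.ones(n)                     # terminal condition z_T = 1
for _ in range(T):
    z = np.exp(-q) * (P @ z)

V = -np.log(z)                     # recover the value function

# Optimal controlled transition probabilities: p*(x'|x) ∝ P[x, x'] * z(x')
pi = P * z[None, :]
pi /= pi.sum(axis=1, keepdims=True)
```

Because each backward step is a single matrix-vector product, the whole computation is linear algebra; the continuous-state version replaces the sum by an integral and leads to the path-integral and estimation-based algorithms mentioned in the abstract.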