Path integral control from probabilistic viewpoint
published: Oct. 16, 2012, recorded: September 2012, views: 4217
We show that stochastic control problems with a particular cost structure involving a relative entropy term admit a purely probabilistic solution, without invoking the dynamic programming principle. The argument is as follows. Minimization of the expectation of a random variable with respect to the underlying probability measure, penalized by relative entropy, can be solved exactly. When the randomness is generated by a standard Brownian motion, this exact solution can be written as a Girsanov density. The stochastic process appearing in the Girsanov exponent plays the role of the control process, and the relative entropy of the change of probability measure equals the integral of the square of this process. An explicit expression for the control process may be obtained in terms of the Malliavin derivative of the density process. The theory is applied to the problem of minimizing the maximum of a Brownian motion (penalized by the relative entropy), leading to an explicit expression for the optimal control law in this case. The theory is then applied to a stochastic process with jumps, illustrating the generality of the method. Finally, the link to linearization of the Hamilton-Jacobi-Bellman equation is made for the case of diffusion processes.
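The exact solution mentioned in the abstract is the Gibbs variational formula: the minimum over measures Q of E_Q[C] + λ·KL(Q‖P) equals −λ·log E_P[exp(−C/λ)], attained by the tilted measure dQ*/dP ∝ exp(−C/λ). A minimal Monte Carlo sketch of this identity, using the running maximum of a Brownian path as the cost (the path discretization, cost choice, and parameter values here are illustrative assumptions, not the lecture's exact setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustration of the exact solution to
#     min_Q  E_Q[C] + lam * KL(Q || P),
# whose optimizer is the Gibbs measure dQ*/dP ∝ exp(-C/lam)
# and whose optimal value is -lam * log E_P[exp(-C/lam)].
lam = 1.0
n_paths, n_steps, T = 100_000, 100, 1.0
dt = T / n_steps

# Simulate Brownian paths under the reference measure P.
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)

# Cost: the maximum of each Brownian path (as in the lecture's example).
C = W.max(axis=1)

# Optimal value via the log-partition function.
weights = np.exp(-C / lam)
optimal_value = -lam * np.log(weights.mean())

# Evaluate the variational objective at Q* itself:
# E_{Q*}[C] + lam * KL(Q* || P), with density dQ*/dP = weights / E[weights].
dens = weights / weights.mean()
value_at_Qstar = np.mean(dens * C) + lam * np.mean(dens * np.log(dens))

print(optimal_value, value_at_Qstar)
```

The two printed numbers agree up to floating-point error, and by Jensen's inequality the optimal value lies below the plain expected cost E_P[C]; under P the tilted density is exactly the Girsanov density referred to in the abstract.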
Download slides: cyberstat2012_bierkens_probabilistic_viewpoint_01.pdf (729.5 KB)