# Learning networks of stochastic differential equations

published: March 7, 2016, recorded: December 2015, views: 2567

## Description

Models based on stochastic differential equations (SDEs) play a crucial role in several domains of science and technology, ranging from chemistry to finance. In this talk I consider the problem of learning the drift coefficient of a p-dimensional stochastic differential equation from a sample path of length T. I assume that the drift is parametrized by a high-dimensional vector, and study the support recovery problem in the case where p is allowed to grow with T. In particular, I describe a general lower bound on the sample complexity T by using a characterization of mutual information as a time integral of conditional variance, due to Kadota, Zakai, and Ziv.

For linear stochastic differential equations, the drift coefficient is parametrized by a p × p matrix that describes which degrees of freedom interact under the dynamics. In this case, I analyze an L1-regularized least-squares estimator and describe an upper bound on T that nearly matches the lower bound on specific classes of sparse matrices. I describe how this same algorithm can be used to learn non-linear SDEs, and in addition show, by means of a numerical experiment, why one should expect the sample complexity to be of the same order as that for linear SDEs.
