Probability and Mathematical Needs
published: Aug. 5, 2010, recorded: July 2010, views: 7569
This lecture covers the basics of linear algebra and probability, as well as a brief introduction to optimization. In linear algebra, it starts with the definitions of vector spaces, dimension, basis, and the span of a set of vectors. Norms and dot products are introduced, along with Hilbert spaces. The problem of solving linear systems is then tackled, introducing matrices, eigenvalues, and some common factorizations (SVD, LU, Cholesky, QR). In probability, we start from the definitions of discrete and continuous random variables, give common examples, and introduce the concepts of independence and conditional probability. We tackle estimation through the Bayesian framework, give basic definitions from information theory (entropy, Kullback-Leibler divergence), and introduce error bounds (Hoeffding bounds). Optimization is briefly introduced by defining extrema and convex functions, and an example of constrained minimization is demonstrated.
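A minimal sketch of two of the topics listed above, using NumPy (not part of the lecture itself, just an illustration): the common matrix factorizations (SVD, QR, Cholesky) applied to a random matrix, and the discrete entropy and Kullback-Leibler divergence formulas. The specific matrices and distributions are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))  # an arbitrary example matrix

# SVD: A = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(A, full_matrices=False)
assert np.allclose(A, U @ np.diag(s) @ Vt)

# QR: A = Q @ R, with Q orthonormal and R upper triangular
Q, R = np.linalg.qr(A)
assert np.allclose(A, Q @ R)

# Cholesky requires a symmetric positive-definite matrix,
# so we build one: S = A^T A + I
S = A.T @ A + np.eye(3)
L = np.linalg.cholesky(S)  # S = L @ L^T, L lower triangular
assert np.allclose(S, L @ L.T)

# Discrete entropy H(p) = -sum p log2 p, and
# KL divergence D(p||q) = sum p log2(p/q), both in bits
p = np.array([0.5, 0.25, 0.25])   # example distribution
q = np.array([1/3, 1/3, 1/3])     # uniform reference
entropy = -np.sum(p * np.log2(p))  # 1.5 bits for this p
kl = np.sum(p * np.log2(p / q))    # >= 0, zero iff p == q
print(entropy, kl)
```

The factorizations mirror the lecture's linear-algebra section; the last few lines compute the two information-theoretic quantities for a concrete pair of distributions, showing that the KL divergence is non-negative.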