published: Aug. 23, 2016, recorded: August 2016, views: 51693
We provide a general introduction to machine learning, aimed at putting all participants on the same page in terms of definitions and basic background. After a brief overview of different machine learning problems, we discuss linear regression, its objective function, and its closed-form solution. We then cover the bias-variance trade-off and the issue of overfitting, along with the proper use of cross-validation to measure performance objectively. We present the probabilistic view of the sum-squared error as maximizing likelihood under specific assumptions about the data-generation process, and show how L2 and L1 regularization arise as priors from a Bayesian perspective. We briefly discuss Bayesian methodology for learning. Finally, we present logistic regression, the cross-entropy optimization criterion, and its solution through first- and second-order methods.
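The closed-form solution and its L2-regularized variant mentioned above can be sketched as follows. This is a minimal illustration with NumPy on synthetic data (the data-generating line y = 3x - 1 and the regularization strength are assumptions chosen for the example, not taken from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x - 1 plus Gaussian noise (illustrative assumption).
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 1))
y = 3.0 * X[:, 0] - 1.0 + 0.1 * rng.standard_normal(n)

# Append a bias column so the intercept is learned jointly with the slope.
Xb = np.hstack([X, np.ones((n, 1))])

# Closed-form ordinary least squares: w = (X^T X)^{-1} X^T y,
# solved as a linear system rather than via an explicit inverse.
w_ols = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)

# L2-regularized (ridge) variant: w = (X^T X + lam*I)^{-1} X^T y,
# which corresponds to a zero-mean Gaussian prior on the weights.
lam = 0.1
w_ridge = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ y)

print(w_ols)    # close to the true parameters [3, -1]
print(w_ridge)  # shrunk slightly toward zero by the penalty
```

Solving the normal equations with `np.linalg.solve` is numerically preferable to forming the matrix inverse explicitly; the ridge term also guarantees the system is well-conditioned even when the features are collinear.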
Download slides: deeplearning2016_precup_machine_learning_01.pdf (2.0 MB)