Learning using Many Examples
published: Nov. 26, 2007, recorded: September 2007, views: 5051
Statistical learning theory suggests choosing large-capacity models that barely avoid over-fitting the training data. From that perspective, all datasets are small. Things become more complicated when one considers the computational cost of processing large datasets. Computationally challenging training sets appear when one wants to emulate intelligence: biological brains learn quite efficiently from the continuous streams of perceptual data generated by our six senses, using limited amounts of sugar as a source of power. Computationally challenging training sets also appear when one wants to analyze the masses of data that describe the life of our computerized society. The more data we understand, the greater the competitive advantage we enjoy.
– The first part of the tutorial clarifies the relation between statistical efficiency, the design of learning algorithms, and their computational cost.
– The second part explores specific learning algorithms and their implementation in detail, with both simple and complex examples.
– The third part considers algorithms that learn with a single pass over the data (see the sketch after this list). Certain such algorithms have optimal properties but are often too costly; workarounds are discussed.
– Finally, the fourth part shows how active example selection provides greater speed and reduces the feedback pressure that constrains parallel implementations.
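The single-pass setting mentioned in the third part is the natural regime of stochastic gradient methods. The following is a minimal illustrative sketch of single-pass stochastic gradient descent for linear least squares; it is not taken from the slides, and the model, step-size schedule, and synthetic data stream are all assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic "stream": examples (x, y) with y = <w_true, x> + noise.
# Dimension, noise level, and step sizes are illustrative choices.
d = 5
w_true = rng.normal(size=d)

w = np.zeros(d)                       # model parameters
for t in range(1, 10_001):            # a single pass: each example is seen once
    x = rng.normal(size=d)            # next example from the stream
    y = w_true @ x + 0.1 * rng.normal()
    grad = (w @ x - y) * x            # gradient of 0.5 * (w.x - y)^2
    w -= grad / (t + 10)              # decreasing step size, roughly 1/t

print("parameter error:", np.linalg.norm(w - w_true))

A plain first-order update like this costs O(d) per example; the "optimal but often too costly" algorithms the abstract alludes to trade more work per example for better use of each example.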
Download slides: mmdss07_bottou_lume_01.pdf (389.1 KB)