Invited Talk: Empirical Risk Minimization with Statistics of Higher Order with Examples from Bipartite Ranking
published: Oct. 20, 2009, recorded: September 2009, views: 3694
Statistical learning theory was mainly developed in the framework of binary classification, under the assumption that the observations in the training set form an i.i.d. sample. The techniques used to provide statistical guarantees for state-of-the-art learning algorithms are borrowed from the theory of empirical processes. This is possible not only because of the i.i.d. assumption on the data, but also because of the nature of the performance measures, such as the classification error or the margin error, which are statistics of order one. In this talk, I will discuss a variety of questions that arise in the theory when more involved criteria are considered. The problem of bipartite ranking through ROC curve optimization provides a prolific source of optimization functionals that are statistics of order strictly larger than one, and several examples will be presented.
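As a concrete illustration of a criterion of order strictly larger than one, consider the empirical AUC used in bipartite ranking: it averages over *pairs* of observations (one positive, one negative) rather than over single observations, making it a U-statistic of order two. The sketch below (my own minimal example, not code from the talk) contrasts it with the classification error, a statistic of order one:

```python
import numpy as np

def empirical_auc(scores_pos, scores_neg):
    """Empirical AUC: the fraction of (positive, negative) pairs that a
    scoring function ranks correctly, counting ties as 1/2.
    This is a statistic of order two: it is an average over pairs of
    observations, unlike the classification error, which averages over
    individual observations (order one)."""
    s_pos = np.asarray(scores_pos, dtype=float)
    s_neg = np.asarray(scores_neg, dtype=float)
    # Pairwise score differences via broadcasting: shape (n_pos, n_neg).
    diff = s_pos[:, None] - s_neg[None, :]
    return float(np.mean((diff > 0) + 0.5 * (diff == 0)))

# A scorer that places every positive above every negative attains AUC = 1.
print(empirical_auc([0.9, 0.8], [0.1, 0.2]))  # → 1.0
# A completely uninformative scorer (all scores equal) attains AUC = 0.5.
print(empirical_auc([0.5, 0.5], [0.5, 0.5]))  # → 0.5
```

Because the summands of such a pairwise average are not independent (each observation appears in many pairs), the standard empirical-process arguments for i.i.d. averages do not apply directly, which is precisely the kind of difficulty the talk addresses.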