Using Interior Point Methods for Optimization in Training Very Large Scale Support Vector Machines
published: Aug. 26, 2009, recorded: June 2009, views: 7971
In this talk we shall discuss the issues of Interior Point Methods (IPMs) applied to solve optimization problems arising in the context of very large-scale Support Vector Machine (SVM) training. First, we will briefly introduce IPMs for linear and quadratic programming and comment on their advantages: (a) polynomial complexity, (b) ability to solve very large problems, (c) excellent practical behaviour (much better than that predicted by the worst-case complexity analysis), and (d) eligibility for parallelisation. We will then address specific features of optimization problems arising in SVM training, in particular the presence of large datasets which are stored as dense matrices and make the problems very well-suited to the use of IPMs. We will survey numerical techniques applicable in this context, such as suitable factorization methods, and comment on several interesting developments made over the last decade which aimed at using IPMs for SVM training. We will demonstrate that the key to success in applying IPMs is the ability to reformulate SVM problems as separable quadratic optimization problems, which can then be successfully tackled by the appropriate linear algebra tools of IPMs [2,3]. Finally, we will comment on the existing challenges for IPMs in the SVM training context and touch on a number of research issues which still remain open. In particular, we will discuss the need to develop new algorithms which may take advantage of new multi-core architectures, the challenges of problems growing ever larger, and the use of non-linear and indefinite kernels. This is joint work with Kristian Woodsend.
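To illustrate the linear-algebra point above, here is a small sketch (not the authors' code, and the function name `solve_smw` is our own) of the trick that makes IPMs efficient once the SVM problem has a separable structure: each IPM iteration must solve a system whose matrix is a large diagonal (from the barrier terms) plus a low-rank term built from the m-by-n data matrix, with the number of features m much smaller than the number of examples n. The Sherman-Morrison-Woodbury identity reduces that n-by-n solve to an m-by-m one:

```python
import numpy as np

# Illustrative sketch of the linear algebra inside an IPM iteration for
# linear SVM training: solve (D + A^T A) x = b, where D = diag(d) is a
# large n-by-n diagonal matrix and A is the m-by-n data matrix, m << n.
# Sherman-Morrison-Woodbury gives
#   (D + A^T A)^{-1} b
#     = D^{-1} b - D^{-1} A^T (I + A D^{-1} A^T)^{-1} A D^{-1} b,
# so only a small m-by-m system is ever factorized.

def solve_smw(d, A, b):
    """Solve (diag(d) + A.T @ A) x = b via the SMW identity.

    Cost is O(n m^2) per solve instead of O(n^3), which is what makes
    IPMs practical on the separable SVM reformulation.
    """
    m, n = A.shape
    Dinv_b = b / d                  # D^{-1} b, O(n)
    ADinvAT = (A / d) @ A.T         # A D^{-1} A^T, small m-by-m matrix
    y = np.linalg.solve(np.eye(m) + ADinvAT, A @ Dinv_b)
    return Dinv_b - (A.T @ y) / d

# Sanity check against a dense solve on a small random instance.
rng = np.random.default_rng(0)
m, n = 5, 200
A = rng.standard_normal((m, n))
d = rng.uniform(1.0, 2.0, size=n)   # barrier terms keep D positive
b = rng.standard_normal(n)

x_smw = solve_smw(d, A, b)
x_dense = np.linalg.solve(np.diag(d) + A.T @ A, b)
print(np.allclose(x_smw, x_dense))  # → True
```

The same idea underlies the dense Cholesky-based implementations surveyed in the talk: the dominant cost per iteration scales linearly in the number of training examples, so very large datasets remain tractable.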