Fast projections onto l1,q-norm balls for grouped feature selection
published: Oct. 3, 2011, recorded: September 2011, views: 3196
Joint sparsity is widely acknowledged as a powerful structural cue for performing feature selection in setups where variables are expected to exhibit "grouped" behavior. Such grouped behavior is commonly modeled by Group-Lasso or Multitask-Lasso-type problems, where feature selection is effected via l1,q-mixed-norms. Several particular formulations for modeling groupwise sparsity have received substantial attention in the literature, and in some cases efficient algorithms are also available. Surprisingly, for constrained formulations of fundamental importance (e.g., regression with an l1,∞-norm constraint), highly scalable methods seem to be missing. We address this deficiency by presenting a method based on spectral projected gradient (SPG) that can tackle l1,q-constrained convex regression problems. The most crucial component of our method is an algorithm for projecting onto l1,q-norm balls. We present several numerical results which show that our methods attain up to 30X speedups on large l1,∞-multitask-lasso problems. Even more dramatic are the gains for just the l1,∞-projection subproblem: we observe almost three orders of magnitude speedup compared against the currently standard method.
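To make the projection subproblem concrete, here is a minimal sketch of projecting a matrix (rows as groups) onto the l1,∞-norm ball. It is not the specialized fast algorithm from the talk; instead it combines two standard building blocks: the sort-based Euclidean projection onto the l1-ball, the Moreau decomposition (the prox of t·‖·‖∞ equals the residual of the l1-ball projection of radius t), and a simple bisection on the Lagrange multiplier of the ball constraint. All function names (`project_l1`, `prox_linf`, `project_l1_inf`) are illustrative, not from the paper.

```python
import numpy as np

def project_l1(v, radius):
    """Euclidean projection of v onto the l1-ball of the given radius
    (standard sort-based method)."""
    u = np.abs(v)
    if u.sum() <= radius:
        return v.copy()
    s = np.sort(u)[::-1]
    cssv = np.cumsum(s)
    # Largest index rho with s[rho] > (cssv[rho] - radius) / (rho + 1)
    rho = np.nonzero(s * np.arange(1, len(s) + 1) > (cssv - radius))[0][-1]
    theta = (cssv[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(u - theta, 0.0)

def prox_linf(v, t):
    """Prox of t*||.||_inf via the Moreau decomposition:
    prox = v - (projection of v onto the l1-ball of radius t)."""
    return v - project_l1(v, t)

def project_l1_inf(Y, tau, tol=1e-10, max_iter=200):
    """Project Y (rows = groups) onto {X : sum_i max_j |X_ij| <= tau}
    by bisecting on the Lagrange multiplier of the constraint.
    A simple sketch, not the paper's fast specialized algorithm."""
    if np.sum(np.max(np.abs(Y), axis=1)) <= tau:
        return Y.copy()          # already feasible
    lo, hi = 0.0, np.abs(Y).sum()  # at hi, every row's prox is zero
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        X = np.vstack([prox_linf(y, mid) for y in Y])
        if np.sum(np.max(np.abs(X), axis=1)) > tau:
            lo = mid             # multiplier too small, still infeasible
        else:
            hi = mid             # feasible; try a smaller multiplier
        if hi - lo < tol:
            break
    return np.vstack([prox_linf(y, hi) for y in Y])
```

The bisection runs in O(iterations × n log n); the point of the talk is that this generic approach can be beaten by orders of magnitude with a projection algorithm tailored to the l1,∞ structure.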