Sparse Algorithms are Not Stable: A No-free-lunch Theorem
author: Huan Xu,
Department of Mechanical Engineering, National University of Singapore
published: Jan. 16, 2013, recorded: December 2012, views: 3842
Description
We consider two widely used notions in machine learning, namely sparsity and stability. Both notions are deemed desirable, and both are believed to lead to good generalization ability. We show that these two notions contradict each other: a sparse algorithm cannot be stable, and vice versa. Thus, one has to trade off sparsity and stability when designing a learning algorithm. This implies that, in contrast to \ell_2 regularized regression, \ell_1 regularized regression (Lasso) cannot be stable.
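A minimal, hypothetical sketch of the trade-off described above, not taken from the lecture: on data with highly correlated features, a small perturbation of the training set can change which features Lasso selects, while ridge (\ell_2) coefficients typically move only slightly. The data, regularization strengths, and perturbation size are illustrative assumptions.

```python
# Illustrative comparison of Lasso (sparse) vs. ridge (stable) under a small
# perturbation of one training sample. Assumed setup, not the speaker's code.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
n, d = 50, 20
X = rng.standard_normal((n, d))
# Two nearly identical columns make the Lasso's selected support fragile.
X[:, 1] = X[:, 0] + 0.01 * rng.standard_normal(n)
y = X[:, 0] + 0.1 * rng.standard_normal(n)

# Perturb a single training sample slightly.
X_pert = X.copy()
X_pert[0] += 0.05 * rng.standard_normal(d)

for name, model in [("lasso", Lasso(alpha=0.05)), ("ridge", Ridge(alpha=0.05))]:
    w = model.fit(X, y).coef_
    w_pert = model.fit(X_pert, y).coef_
    print(f"{name}: coefficient change = {np.linalg.norm(w - w_pert):.4f}, "
          f"features whose selection flipped = {np.sum((w != 0) != (w_pert != 0))}")
```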