Robust Near-Separable Nonnegative Matrix Factorization Using Linear Optimization
published: Aug. 26, 2013, recorded: July 2013, views: 5557
Nonnegative matrix factorization (NMF) has recently been shown to be tractable under the separability assumption, which requires the columns of the input data matrix to belong to the convex cone generated by a small number of those columns. Bittorf, Recht, Ré and Tropp ('Factoring nonnegative matrices with linear programs', NIPS 2012) proposed a linear programming (LP) model, referred to as HottTopixx, which is robust to any small perturbation of the input matrix. However, HottTopixx has two important drawbacks: (i) the input matrix has to be normalized, and (ii) the factorization rank has to be known in advance. In this talk, we generalize HottTopixx in order to resolve these two drawbacks; that is, we propose a new LP model which does not require normalization and detects the factorization rank automatically. Moreover, the new LP model is more flexible, significantly more tolerant to noise, and can easily be adapted to handle outliers and other noise models. We show on several synthetic datasets that it outperforms HottTopixx while competing favorably with two state-of-the-art methods.
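To make the separability assumption concrete, here is a minimal NumPy sketch (illustrative only; it is not the HottTopixx LP or the new LP model from the talk). Under separability, M = WH where W = M[:, K] for some small index set K, i.e. the "pure" columns of W appear verbatim among the columns of M; the variable names below are chosen for illustration.

```python
import numpy as np

# Build a separable nonnegative matrix M = W H, where H contains an
# identity block so the r columns of W appear as columns of M itself.
rng = np.random.default_rng(0)
m, n, r = 6, 10, 3

W = rng.random((m, r))                   # r nonnegative "pure" columns
H = np.hstack([np.eye(r),                # pure columns appear in M verbatim
               rng.random((r, n - r))])  # other columns: nonneg mixtures
M = W @ H                                # separable nonnegative data matrix

# Separable NMF methods aim to identify the index set K of pure columns;
# here we constructed M so that K = {0, ..., r-1} by design.
K = list(range(r))
W_rec = M[:, K]

# Once K is known, W is recovered exactly by extracting those columns
# (H could then be recovered via nonnegative least squares).
assert np.allclose(W_rec @ H, M)
print("columns selected:", K)
```

In noisy settings the identity block is only approximate, which is why robust formulations such as HottTopixx and the LP model presented here are needed to select K reliably.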