Provable Matrix Completion using Alternating Minimization
published: Jan. 16, 2013, recorded: December 2012, views: 4186
Alternating minimization has emerged as a popular heuristic for large-scale machine learning problems involving low-rank matrices. However, there have been few (if any) theoretical guarantees on its performance. In this work, we investigate the natural alternating minimization algorithm for the popular matrix sensing problem first formulated in [RFP07]; this problem asks for the recovery of an unknown low-rank matrix from a small number of linear measurements thereof. We show that under suitable RIP conditions, alternating minimization linearly converges to the true matrix. Our result can be extended to matrix completion from randomly sampled entries. Our analysis uses only elementary linear algebra and exploits the fact that, under RIP, alternating minimization can be viewed as a noisy version of orthogonal iteration (which is used to compute the top singular vectors of a matrix).
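The alternating minimization scheme described above can be sketched in a few lines of NumPy. The sketch below is illustrative, not the paper's exact procedure: it uses a random start (the analyzed variant initializes from the SVD of the observed matrix) and the function name and parameters are hypothetical. It alternates least-squares solves for the two low-rank factors, fitting only the observed entries.

```python
import numpy as np

def altmin_complete(M_obs, mask, rank, iters=100):
    """Alternating minimization for matrix completion (illustrative sketch).

    M_obs : observed matrix (arbitrary values where unobserved)
    mask  : boolean matrix, True where the entry is observed
    rank  : target rank k
    Alternates least-squares solves for U (m x k) and V (n x k),
    holding the other factor fixed, over the observed entries only.
    """
    m, n = M_obs.shape
    rng = np.random.default_rng(0)
    # Random initialization for illustration; the analyzed algorithm
    # initializes U from the top singular vectors of the observed matrix.
    U = rng.standard_normal((m, rank))
    V = np.zeros((n, rank))
    for _ in range(iters):
        # Fix U: solve for each row of V using that column's observed entries.
        for j in range(n):
            rows = mask[:, j]
            if rows.any():
                V[j], *_ = np.linalg.lstsq(U[rows], M_obs[rows, j], rcond=None)
        # Fix V: solve for each row of U using that row's observed entries.
        for i in range(m):
            cols = mask[i]
            if cols.any():
                U[i], *_ = np.linalg.lstsq(V[cols], M_obs[i, cols], rcond=None)
    return U @ V.T
```

Each inner solve is an ordinary least-squares problem, so every iteration is cheap; the paper's contribution is showing that, under RIP-type conditions, this alternation converges linearly to the true low-rank matrix.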
Download slides: nipsworkshops2012_netrapalli_minimization_01.pdf (308.4 KB)