Provable Matrix Completion using Alternating Minimization

author: Praneeth Netrapalli, Department of Electrical and Computer Engineering, University of Texas at Austin
published: Jan. 16, 2013,   recorded: December 2012


Alternating minimization has emerged as a popular heuristic for large-scale machine learning problems involving low-rank matrices. However, there have been few (if any) theoretical guarantees on its performance. In this work, we investigate the natural alternating minimization algorithm for the popular matrix sensing problem first formulated in [RFP07]; this problem asks for the recovery of an unknown low-rank matrix from a small number of linear measurements thereof. We show that under suitable RIP conditions, alternating minimization linearly converges to the true matrix. Our result can be extended to matrix completion from randomly sampled entries. Our analysis uses only elementary linear algebra and exploits the fact that, under RIP, alternating minimization can be viewed as a noisy version of orthogonal iteration (which is used to compute the top singular vectors of a matrix).
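The algorithm the abstract analyzes alternates between fixing one low-rank factor and solving a least-squares problem for the other. The sketch below is a minimal NumPy illustration of that scheme for matrix completion, using the standard SVD-based initialization of the left factor; the function name, parameters, and iteration count are illustrative choices, not details taken from the talk.

```python
import numpy as np

def altmin_complete(M_obs, mask, r, iters=50):
    """Alternating minimization for matrix completion (illustrative sketch).

    M_obs: observed matrix, with zeros at unobserved entries
    mask : boolean matrix, True where an entry is observed
    r    : target rank
    """
    m, n = M_obs.shape
    # Initialize U with the top-r left singular vectors of the
    # zero-filled observation matrix, rescaled by the sampling rate.
    p = mask.mean()
    U, _, _ = np.linalg.svd(M_obs / p, full_matrices=False)
    U = U[:, :r]
    V = np.zeros((n, r))
    for _ in range(iters):
        # Fix U: each column of V solves an independent least-squares
        # problem over the observed entries of that column.
        for j in range(n):
            rows = mask[:, j]
            V[j], *_ = np.linalg.lstsq(U[rows], M_obs[rows, j], rcond=None)
        # Fix V: symmetrically, solve for each row of U.
        for i in range(m):
            cols = mask[i, :]
            U[i], *_ = np.linalg.lstsq(V[cols], M_obs[i, cols], rcond=None)
    return U @ V.T
```

On a random well-conditioned low-rank matrix with a reasonable fraction of entries observed, the iterates converge rapidly, consistent with the linear-convergence behavior the abstract describes.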

Download slides: nipsworkshops2012_netrapalli_minimization_01.pdf (308.4 KB)
