Convex Relaxation and Estimation of High-Dimensional Matrices

Published on May 06, 2011 · 6178 views

Problems that require estimating high-dimensional matrices from noisy observations arise frequently in statistics and machine learning. Examples include dimensionality reduction methods (e.g., principal component analysis).
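As a concrete illustration of the convex-relaxation idea discussed in the talk, the sketch below solves a noisy matrix completion problem by replacing the nonconvex rank penalty with the nuclear norm, minimized via proximal gradient descent (singular value soft-thresholding). This is a generic minimal example, not the speaker's specific estimator; the step size, regularization weight `tau`, and problem sizes are illustrative assumptions.

```python
import numpy as np

def svt(M, tau):
    """Singular value soft-thresholding: the prox operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(Y, mask, tau=0.1, step=1.0, iters=200):
    """Proximal gradient on 0.5 * ||mask * (X - Y)||_F^2 + tau * ||X||_*."""
    X = np.zeros_like(Y)
    for _ in range(iters):
        grad = mask * (X - Y)          # gradient of the observed-entry loss
        X = svt(X - step * grad, step * tau)
    return X

# Tiny demo: recover a rank-1 matrix from noisy partial observations.
rng = np.random.default_rng(0)
u, v = rng.standard_normal(20), rng.standard_normal(15)
L = np.outer(u, v)                     # rank-1 ground truth
mask = rng.random(L.shape) < 0.6       # observe roughly 60% of entries
Y = mask * (L + 0.01 * rng.standard_normal(L.shape))
X = complete(Y, mask)
err = np.linalg.norm(X - L) / np.linalg.norm(L)
```

The soft-thresholding of singular values is exactly where the convex relaxation acts: it shrinks the spectrum toward a low-rank solution while keeping each iteration a convex proximal step.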

Chapter list

Convex relaxation and high-dimensional matrices (00:00)
Introduction (00:16)
Question (01:58)
Modern viewpoint (02:36)
(Nearly) low-rank matrices (04:06)
Example: Multiview imaging (05:25)
Low-rank multitask regression (06:40)
Example: Collaborative filtering (08:28)
Security and robustness issues (1) (11:13)
Security and robustness issues (2) (11:46)
Example: Matrix decomposition (13:02)
Example: Learning graphical models (13:59)
Example: Constrained system identification (1) (14:02)
Example: Constrained system identification (2) (14:04)
Remainder of talk (14:06)
Matrix regression problems (1) (15:54)
Matrix regression problems (2) (17:44)
Noisy matrix completion (unrescaled) (20:53)
Strong convexity never holds (26:01)
Restricted strong convexity (RSC) (1) (27:40)
Restricted strong convexity (RSC) (2) (29:06)
General guarantee for regression with nuclear norm (1) (31:04)
General guarantee for regression with nuclear norm (2) (32:36)
General guarantee for regression with nuclear norm (3) (33:39)
Example: Matrix completion (37:41)
A milder “spikiness” condition (1) (40:40)
A milder “spikiness” condition (2) (40:59)
Noisy matrix completion (general lq-balls) (41:33)
Other work for exactly low rank matrices (1) (44:18)
Other work for exactly low rank matrices (2) (44:29)
Other work for exactly low rank matrices (3) (45:04)
Example: Noisy matrix decomposition (1) (45:38)
Summary (46:36)