## Tensor principal component analysis

published: Aug. 20, 2015, recorded: July 2015


# Description

We study a statistical model for the *tensor principal component analysis problem* introduced by Montanari and Richard: given an order-$3$ tensor $\mathbf{T}$ of the form $\mathbf{T} = \tau \cdot v_0^{\otimes 3} + \mathbf{A}$, where $\tau \geq 0$ is a signal-to-noise ratio, $v_0$ is a unit vector, and $\mathbf{A}$ is a random noise tensor, the goal is to recover the planted vector $v_0$. For the case that $\mathbf{A}$ has iid standard Gaussian entries, we give an efficient algorithm that recovers $v_0$ whenever $\tau \gg n^{3/4} \log(n)^{1/4}$ and certifies that the recovered vector is close to a maximum likelihood estimator, all with high probability over the random choice of $\mathbf{A}$. The previous best algorithms with provable guarantees required $\tau \geq \Omega(n)$. In the regime $\tau \ll n$, natural tensor-unfolding-based spectral relaxations of the underlying optimization problem break down. To go beyond this barrier, we use convex relaxations based on the sum-of-squares method: our recovery algorithm proceeds by rounding a degree-$4$ sum-of-squares relaxation of the maximum-likelihood-estimation problem for the statistical model. To complement our algorithmic results, we show that degree-$4$ sum-of-squares relaxations break down for $\tau \ll n^{3/4}$, which demonstrates that improving our current guarantees (by more than logarithmic factors) would require new techniques or might even be intractable. Finally, we show how to exploit additional problem structure in order to solve our sum-of-squares relaxations, up to some approximation, very efficiently. Our fastest algorithm runs in nearly linear time using shifted (matrix) power iteration and has guarantees similar to those above. The analysis of this algorithm also confirms a conjecture of Montanari and Richard about singular vectors of tensor unfoldings.
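The statistical model and the tensor-unfolding idea mentioned in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's algorithm: it plants a spike $\tau \cdot v_0^{\otimes 3}$ in Gaussian noise, unfolds the tensor into an $n \times n^2$ matrix, and estimates $v_0$ by the top left singular vector. The choice of $n$ and the constant in front of $n^{3/4}$ are arbitrary illustration parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
tau = 5 * n ** 0.75  # signal-to-noise ratio, placed above the n^{3/4} scale

# Planted model: T = tau * v0^{(x)3} + A, with A an iid standard Gaussian tensor.
v0 = rng.standard_normal(n)
v0 /= np.linalg.norm(v0)  # v0 is a unit vector
A = rng.standard_normal((n, n, n))
T = tau * np.einsum("i,j,k->ijk", v0, v0, v0) + A

# Tensor-unfolding spectral method: flatten T into an n x n^2 matrix and
# take the top left singular vector as the estimate of v0.
M = T.reshape(n, n * n)
U, S, Vt = np.linalg.svd(M, full_matrices=False)
v_hat = U[:, 0]

# Overlap with the planted vector (sign of v_hat is not identifiable,
# so compare absolute correlations).
corr = abs(v_hat @ v0)
```

At this signal strength the unfolding's top singular vector correlates strongly with $v_0$; the regime the paper targets is the harder one, $n^{3/4} \ll \tau \ll n$, where the paper's sum-of-squares relaxation and shifted power iteration come into play.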
