Gradient Descent for Symmetric Tensor Decomposition
Jian-Feng Cai, Haixia Liu, Yang Wang
(Department of Mathematics, The Hong Kong University of Science and
Technology, Clear Water Bay, Kowloon, Hong Kong, China; School of Mathematics and Statistics & Hubei Key Laboratory of
Engineering Modeling and Scientific Computing, Huazhong University of
Science and Technology, Wuhan, Hubei 430074, China)
DOI:
Abstract:
Symmetric tensor decomposition is of great importance in applications. Several studies have employed a greedy approach, in which one first finds a best rank-one approximation of a given tensor and then repeats the process on the residual obtained by subtracting the rank-one component. In this paper, we focus on finding a best rank-one approximation of a given orthogonally decomposable order-3 symmetric tensor. We give a geometric landscape analysis of a nonconvex optimization problem for the best rank-one approximation of orthogonally decomposable symmetric tensors. We show that any local minimizer must be a factor in this orthogonal symmetric tensor decomposition, and that any other critical point is a linear combination of the factors. We then propose a gradient descent algorithm with a carefully designed initialization to solve this nonconvex optimization problem, and we prove that the algorithm converges to the global minimum with high probability for orthogonally decomposable tensors. This result, combined with the landscape analysis, shows that the greedy algorithm recovers the low-rank CP decomposition of the tensor. Numerical results are provided to verify our theoretical findings.
Key words: Gradient descent, random initialization, symmetric tensor decomposition, CP decomposition, linear convergence.
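As an illustration, the greedy deflation scheme described in the abstract (find a best rank-one approximation, subtract it, and repeat on the residual) can be sketched as below for an order-3 symmetric tensor. This toy sketch uses plain symmetric tensor power iteration with random initialization rather than the carefully designed initialization analyzed in the paper; the function names `rank_one_approx` and `greedy_decomposition` are illustrative, not from the paper.

```python
import numpy as np

def rank_one_approx(T, iters=200, seed=0):
    """Approximate a best symmetric rank-one term lam * (u o u o u)
    of an order-3 symmetric tensor T via power iteration (a sketch)."""
    n = T.shape[0]
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)
    for _ in range(iters):
        # v_i = sum_{j,k} T[i,j,k] * u_j * u_k, then renormalize
        v = np.einsum('ijk,j,k->i', T, u, u)
        u = v / np.linalg.norm(v)
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # lam = T(u, u, u)
    return lam, u

def greedy_decomposition(T, rank):
    """Greedy deflation: extract a rank-one component, subtract it
    from the residual, and repeat `rank` times."""
    factors = []
    R = T.copy()
    for r in range(rank):
        lam, u = rank_one_approx(R, seed=r)
        factors.append((lam, u))
        R = R - lam * np.einsum('i,j,k->ijk', u, u, u)
    return factors
```

For an orthogonally decomposable tensor, each extracted direction aligns with one of the orthonormal factors, so the residual shrinks by exactly one rank-one term per step and the factors are recovered up to sign.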