On Accelerating the Regularized Alternating Least Square Algorithm for Tensors

Academic Article

Abstract

  • In this paper, we discuss the acceleration of the regularized alternating least squares (RALS) algorithm for tensor approximation. We propose a fast iterative method that applies Aitken-Steffensen-like updates to the regularized algorithm. Numerical experiments demonstrate a faster convergence rate for the accelerated version in comparison to both the standard and regularized alternating least squares algorithms. In addition, we analyze global convergence based on the Kurdyka-Łojasiewicz inequality and show that the RALS algorithm has a linear local convergence rate.
  • Keywords

  • math.NA
  • Author List

  • Wang X; Navasca C; Kindermann S
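As a rough illustration of the Aitken-Steffensen-type acceleration idea mentioned in the abstract, here is a minimal sketch (not the paper's exact update rule; the helper name aitken_accelerate and the scalar test map are hypothetical) of a componentwise Aitken delta-squared extrapolation applied to successive fixed-point iterates:

```python
import numpy as np

def aitken_accelerate(x0, x1, x2, eps=1e-12):
    # Componentwise Aitken delta-squared extrapolation of three successive
    # fixed-point iterates x0, x1, x2 (illustrative helper only, not the
    # paper's exact RALS acceleration step).
    d1 = x1 - x0
    d2 = x2 - 2.0 * x1 + x0
    x_acc = np.array(x2, dtype=float, copy=True)
    safe = np.abs(d2) > eps          # avoid dividing by tiny second differences
    x_acc[safe] = x0[safe] - d1[safe] ** 2 / d2[safe]
    return x_acc

# Toy usage on a contractive map g (each component converges to sqrt(2)).
g = lambda x: 0.5 * (x + 2.0 / x)
x = np.array([5.0, 10.0])
for _ in range(5):
    x = aitken_accelerate(x, g(x), g(g(x)))
print(x)   # approximately [1.41421356, 1.41421356]
```

In the RALS setting one would presumably feed the extrapolation with the sequence of (vectorized) factor iterates produced by successive alternating sweeps; that wiring is not shown here.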