
Tensor Low Rank Approximation And Gradient Flow

Posted on: 2016-05-03
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L Q Wang
Full Text: PDF
GTID: 1220330461477717
Subject: Computational Mathematics
Abstract/Summary:
1. Tensor decomposition has important applications across many disciplines, yet it remains an extremely challenging task even to this date. A somewhat more manageable endeavor has been to find a low rank approximation in place of the full decomposition. Even for this less stringent undertaking, it is an established fact that tensors of order higher than two can fail to have a best low rank approximation, with the notable exception that a best rank-one approximation always exists for tensors of any order. For the rank-one problem, the most popular approach is the alternating least squares (ALS) method, whose numerical scheme takes the form of a variant of the power method. Although the limiting behavior of the objective values is well understood, a proof of global convergence for the iterates themselves has been elusive. Chapter 2 partially fills this gap by showing that for almost all tensors, the iterates generated by the ALS method for the rank-one approximation converge globally. The underlying technique is an eclectic mix of tools from algebraic geometry and dynamical systems.

2. With two notable exceptions, namely that tensors of order 2 (matrices) always have best approximations of arbitrary low rank and that tensors of any order always have a best rank-one approximation, it is known that higher-order tensors may fail to have best low rank approximations. When orthogonality is imposed, even under the modest assumption that only one set of components in the decomposed rank-one tensors is required to be mutually perpendicular, the situation changes completely: orthogonal low rank approximations always exist. The purpose of Chapter 3 is to discuss the best low rank approximation subject to orthogonality. The conventional higher-order power method is modified to enforce orthogonality via the polar decomposition.
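The ALS iteration for the rank-one case described above can be sketched in NumPy for a third-order tensor. This is a minimal illustrative sketch, not the dissertation's code; the function name `rank1_als` and all parameter choices are assumptions. Each step fixes two factors and solves the least-squares problem for the third, which reduces to a tensor contraction followed by normalization, exactly the power-method-like structure the abstract mentions.

```python
import numpy as np

def rank1_als(T, iters=200, tol=1e-10, seed=0):
    """Rank-one approximation T ~ lam * (x ⊗ y ⊗ z) of a 3-way tensor
    by alternating least squares (a higher-order power iteration)."""
    rng = np.random.default_rng(seed)
    # random unit-norm starting factors
    x = rng.standard_normal(T.shape[0]); x /= np.linalg.norm(x)
    y = rng.standard_normal(T.shape[1]); y /= np.linalg.norm(y)
    z = rng.standard_normal(T.shape[2]); z /= np.linalg.norm(z)
    lam = 0.0
    for _ in range(iters):
        # update each factor with the other two fixed (a least-squares step)
        x = np.einsum('ijk,j,k->i', T, y, z); x /= np.linalg.norm(x)
        y = np.einsum('ijk,i,k->j', T, x, z); y /= np.linalg.norm(y)
        z = np.einsum('ijk,i,j->k', T, x, y)
        lam_new = np.linalg.norm(z); z /= lam_new
        if abs(lam_new - lam) < tol:
            lam = lam_new
            break
        lam = lam_new
    return lam, x, y, z
```

The objective value `lam` is monotonically nondecreasing along the iteration, which is the well-understood part; the convergence of the factor iterates `x, y, z` themselves is the harder question Chapter 2 addresses.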
Algebraic geometry techniques are again employed to show that, for almost all tensors, the orthogonal alternating least squares method converges globally.

3. Solving a system of linear equations via its normal equation is usually highly unrecommended, because this approach worsens the condition number and inflates the computational cost. For linear systems whose unknowns are matrices, however, such as the Sylvester equation, the Lyapunov equation, the Stein equation, and a variety of their generalizations, formulating the corresponding normal equation in the sense of tensor operators offers a common structure via gradient dynamics. Chapter 4 explains the setting of this framework and demonstrates its versatility with a single simple ODE integrator that can handle almost all of these types of problems.
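As an illustration of the gradient-dynamics idea, consider the Sylvester equation AX + XB = C. The normal-equation formulation minimizes f(X) = ½‖AX + XB − C‖²_F, whose gradient flow is dX/dt = −(Aᵀ(AX + XB − C) + (AX + XB − C)Bᵀ). The sketch below integrates this flow with forward Euler, the simplest possible choice, standing in for whatever ODE integrator Chapter 4 actually uses; the function name and step-size defaults are illustrative assumptions.

```python
import numpy as np

def sylvester_gradient_flow(A, B, C, h=1e-2, iters=5000, tol=1e-10):
    """Solve AX + XB = C by following the gradient flow of
    f(X) = 0.5 * ||A X + X B - C||_F^2 with forward Euler.
    The step h must be small relative to the operator's norm for stability."""
    X = np.zeros((A.shape[1], B.shape[0]))
    for _ in range(iters):
        R = A @ X + X @ B - C       # residual of the matrix equation
        G = A.T @ R + R @ B.T       # gradient of f: the normal-equation operator
        if np.linalg.norm(G) < tol:
            break
        X -= h * G                  # explicit Euler step along -grad f
    return X
```

The same residual/gradient pattern carries over to the Lyapunov and Stein equations by swapping the map X ↦ AX + XB for the corresponding linear operator, which is the common structure the abstract refers to.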
Keywords/Search Tags: tensor decomposition, linear matrix equation, gradient flow, ALS method, tensor rank-1 approximation, tensor low rank approximation