Fast Algorithms For Tensor Equations With Applications

Posted on: 2018-08-15    Degree: Master    Type: Thesis
Country: China    Candidate: L S Lv    Full Text: PDF
GTID: 2310330566466502    Subject: Computational Mathematics
Abstract/Summary:
With the development of science and technology, tensors have found wide application in many fields, such as signal processing, image processing, nonlinear optimization, higher-order statistics, and data mining. Many engineering and scientific computing problems can be expressed in terms of tensor-vector products, which leads to so-called tensor equations. Moreover, tensor eigenvalue complementarity problems and the limiting probability distribution of a higher-order Markov chain can both be converted into tensor equations. In this thesis, we propose fast and efficient algorithms for the following three kinds of tensor equations.

In Chapter 2, a superlinearly convergent algorithm is proposed for the tensor equation Ax^{m-1} = b, which can be regarded as a natural generalization of the matrix equation Ax = b. We first reformulate the tensor equation as a least squares problem and then apply the Gauss-Newton method to solve it; the method also applies to more general tensor equations. Under certain conditions, the global convergence and the superlinear convergence rate of the algorithm are established. Finally, the algorithm is applied to computing the largest eigenvalue of nonnegative tensors and the sparsest solutions of M-tensor complementarity problems. The experimental results verify the effectiveness of the Gauss-Newton method. (An illustrative sketch of this iteration is given after the abstract.)

In Chapter 3, we study a fast optimization algorithm for tensor eigenvalue complementarity problems. The tensor eigenvalue complementarity problem is one of the most basic problems concerning tensor eigenvalues and can be regarded as a higher-order extension of the matrix eigenvalue complementarity problem. In this chapter, a smoothing Newton method is proposed for its solution. We first reformulate the tensor eigenvalue complementarity problem as a system of smoothed tensor equations by introducing a smoothing approximation of an NCP function, and then apply the smoothing Newton method to this system. Under certain conditions, the convergence of the algorithm is guaranteed by existing results. Numerical examples show that our algorithm is efficient and competitive with some existing algorithms. (A schematic form of the smoothed system is shown after the abstract.)

In Chapter 4, we mainly study the limiting probability distribution of a higher-order Markov chain, which can also be formulated as a tensor equation. We propose a quadratic extrapolation method for computing this limiting probability distribution. Under certain assumptions, the higher-order Markov chain can be converted into a first-order Markov chain, and a quadratic extrapolation method for its limiting probability distribution is then developed. Under suitable conditions, the convergence of the extrapolation method is established. Numerical results show that the proposed algorithm converges faster than the power method. (A sketch of the accelerated power iteration follows the abstract.)
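
To make the Chapter 2 approach concrete, the following Python sketch (a minimal illustration, not the thesis code; it assumes m = 3 and a tensor A that is symmetric in its last two modes, and the function name and synthetic data are hypothetical) reformulates A x^{m-1} = b as a least squares problem and applies a Gauss-Newton iteration to it:

    import numpy as np

    def gauss_newton_tensor_eq(A, b, x0, tol=1e-10, max_iter=100):
        # Gauss-Newton sketch for A x^{m-1} = b with a third-order tensor A
        # (m = 3), assumed symmetric in its last two modes.
        x = x0.astype(float).copy()
        for _ in range(max_iter):
            Ax = np.einsum('ijk,k->ij', A, x)      # the matrix A x^{m-2}
            F = Ax @ x - b                         # residual F(x) = A x^{m-1} - b
            if np.linalg.norm(F) < tol:
                break
            J = 2.0 * Ax                           # Jacobian of x -> A x^2 for symmetric A
            d, *_ = np.linalg.lstsq(J, -F, rcond=None)  # Gauss-Newton step: min ||J d + F||
            x = x + d
        return x

    # Illustrative usage on synthetic data (hypothetical example):
    rng = np.random.default_rng(0)
    n = 5
    A = rng.random((n, n, n))
    A = 0.5 * (A + A.transpose(0, 2, 1))           # symmetrize the last two modes
    x_true = rng.random(n)
    b = np.einsum('ijk,j,k->i', A, x_true, x_true)
    x = gauss_newton_tensor_eq(A, b, np.ones(n))
    print(np.linalg.norm(np.einsum('ijk,j,k->i', A, x, x) - b))  # residual norm

The least squares reformulation is what allows a Gauss-Newton step here: each iteration linearizes the residual and solves a small linear least squares problem instead of forming second derivatives.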
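
To illustrate the Chapter 3 reformulation schematically (the exact formulation and NCP function used in the thesis may differ), suppose the tensor eigenvalue complementarity problem asks for a scalar lambda and a vector x >= 0, x != 0, such that w = lambda B x^{m-1} - A x^{m-1} >= 0 and x^T w = 0, where A and B are m-th order tensors. Using the smoothed Fischer-Burmeister function as the smoothing approximation of the NCP function, the complementarity conditions become a smooth square system in (lambda, x) that a Newton method can solve while the smoothing parameter mu is driven to zero:

    \phi_\mu(a, b) = a + b - \sqrt{a^2 + b^2 + 2\mu^2}, \qquad
    H_\mu(\lambda, x) =
    \begin{pmatrix}
        \Bigl( \phi_\mu\bigl(x_i,\; (\lambda \mathcal{B}x^{m-1} - \mathcal{A}x^{m-1})_i\bigr) \Bigr)_{i=1}^{n} \\
        e^{\mathsf{T}} x - 1
    \end{pmatrix}
    = 0,

where e is the all-ones vector used for normalization. As mu tends to zero, phi_mu(a, b) = 0 recovers a >= 0, b >= 0, ab = 0 componentwise, so a solution of the smoothed system approaches a complementarity eigenpair.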
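
For Chapter 4, the sketch below shows a power iteration x <- P x^{m-1} for a third-order transition probability tensor P (m = 3), accelerated by periodic quadratic extrapolation. The extrapolation formulas follow the PageRank-style scheme of Kamvar et al. as an assumed stand-in; the variant actually developed in the thesis may differ, and the function names and parameters are illustrative.

    import numpy as np

    def quadratic_extrapolation(x3, x2, x1, x0):
        # One quadratic-extrapolation step from four successive iterates,
        # x3 oldest ... x0 newest (PageRank-style scheme, assumed here).
        y2, y1, y0 = x2 - x3, x1 - x3, x0 - x3
        Y = np.column_stack([y2, y1])
        g, *_ = np.linalg.lstsq(Y, -y0, rcond=None)   # gamma_1, gamma_2 (gamma_3 = 1)
        g1, g2, g3 = g[0], g[1], 1.0
        b0, b1, b2 = g1 + g2 + g3, g2 + g3, g3        # beta coefficients
        x_new = b0 * x2 + b1 * x1 + b2 * x0
        x_new = np.maximum(x_new, 0.0)
        return x_new / x_new.sum()                    # project back to the simplex

    def limiting_distribution(P, tol=1e-10, max_iter=1000, extrap_every=10):
        # Power method for a third-order transition probability tensor P,
        # i.e. sum_i P[i, j, k] = 1 for every (j, k), with periodic
        # quadratic-extrapolation acceleration.
        n = P.shape[0]
        x = np.full(n, 1.0 / n)
        history = [x]
        for k in range(1, max_iter + 1):
            x_new = np.einsum('ijk,j,k->i', P, x, x)  # x <- P x^{m-1}
            x_new /= x_new.sum()
            if np.linalg.norm(x_new - x, 1) < tol:
                return x_new
            history.append(x_new)
            if k % extrap_every == 0 and len(history) >= 4:
                x_new = quadratic_extrapolation(*history[-4:])
                history.append(x_new)
            x = x_new
        return x

A synthetic P can be built, for example, by normalizing a random nonnegative array along its first mode (P /= P.sum(axis=0, keepdims=True)); the plain power method corresponds to never calling the extrapolation step.
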
Keywords/Search Tags: Tensor equations, Tensor eigenvalue complementarity problems, Smoothing Newton method, Limiting probability distribution of higher-order Markov chains, Quadratic extrapolation method