
Theory And Algorithms Of Constrained Matrix Inequality And Its Least Squares Problem

Posted on: 2018-03-14    Degree: Master    Type: Thesis
Country: China    Candidate: Q Zhou    Full Text: PDF
GTID: 2310330542459806    Subject: Computational Mathematics
Abstract/Summary:
Linearly constrained matrix inequalities and their least squares problems have long been an important topic in numerical linear algebra, with important applications in fields such as image restoration, control theory, and combinatorial optimization. Moreover, in many practical applications the constrained matrix is required to be not only symmetric but also nonnegative. This dissertation studies the following two kinds of constrained matrix inequality and least squares problems.

Problem one: Given matrices A ∈ R^(m×n), B ∈ R^(m×n), C ∈ R^(p×n), D ∈ R^(n×q) and E ∈ SR^(p×q), find a matrix X ∈ BSR^(n×n) (the set of n×n real bisymmetric matrices) such that AX = B and CXD ≥ E.

Problem two: Given matrices A ∈ R^(m×n), B ∈ R^(m×n), C ∈ R^(p×n), D ∈ R^(n×q) and E ∈ R^(p×q), find a matrix X ∈ BSR^(n×n) with X ≥ 0 such that AX = B and CXD ≥ E.

For the first problem, the dissertation considers the equivalent smallest nonnegative deviation problem for the matrix inequality. Based on an existing algorithm, a modified iteration method is proposed; within each iteration, the overall computational cost is reduced by applying the matrix-form LSQR method to solve a constrained least squares subproblem over a low-dimensional matrix Krylov subspace in finitely many steps. Related numerical experiments are reported. Furthermore, a gradient method is introduced for the smallest nonnegative deviation problem. We first show that the corresponding objective function is continuously differentiable and that its gradient is Lipschitz continuous. Then, inspired by the fast iterative shrinkage-thresholding algorithm and by combining the projection theorem with related results from numerical optimization, an accelerated gradient method with the attractive O(1/k²) iteration complexity is presented. Numerical experiments on randomly generated data demonstrate the efficiency of the proposed method; the results indicate that the accelerated gradient method compares favorably with the modified iteration method.

For the second problem, the quadratic penalty function approach is used to transform it into an unconstrained convex optimization problem. Relying on the properties of bisymmetric matrices, the KKT conditions, and the fast iterative shrinkage-thresholding algorithm, a fast algorithm is designed to solve this problem. Finally, the global convergence of the proposed method is proved, and preliminary numerical experiments verify its effectiveness.
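The accelerated gradient method summarised above follows the familiar FISTA template: a gradient step taken from an extrapolated point, followed by a Nesterov-style momentum update, which is what yields the O(1/k²) iteration complexity. The following is a minimal sketch of that template only, not the dissertation's algorithm: the smooth objective's gradient `grad_f`, the projection `project` onto the feasible set, and the Lipschitz constant `L` are placeholders that would be instantiated from the smallest nonnegative deviation formulation.

```python
import numpy as np

def accelerated_gradient(grad_f, project, X0, L, max_iter=500, tol=1e-8):
    """FISTA-style accelerated projected gradient method (generic sketch).

    grad_f  : callable returning the gradient of the smooth objective
    project : projection onto the feasible set (problem-specific)
    L       : Lipschitz constant of grad_f; the step size is 1/L
    """
    X_prev = X0
    Y = X0
    t = 1.0
    for _ in range(max_iter):
        # Projected gradient step taken from the extrapolated point Y
        X = project(Y - grad_f(Y) / L)
        # Nesterov momentum update responsible for the O(1/k^2) rate
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = X + ((t - 1.0) / t_next) * (X - X_prev)
        # Stop once successive iterates are relatively close
        if np.linalg.norm(X - X_prev) <= tol * max(1.0, np.linalg.norm(X)):
            return X
        X_prev, t = X, t_next
    return X_prev
```

In the dissertation's setting the projection would enforce the structural constraints on X (e.g. bisymmetry); here it is deliberately left abstract.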
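For the second problem, the abstract describes a quadratic penalty reformulation: the constraints are moved into the objective as squared penalty terms, leaving a smooth convex function to which a FISTA-type method can be applied. The sketch below illustrates only this general penalty idea for AX = B and CXD ≥ E; the penalty weighting, the treatment of the bisymmetry and nonnegativity constraints on X, and the exact objective used in the thesis are not given in the abstract, so this is an illustrative assumption rather than the thesis's formulation.

```python
import numpy as np

def penalty_value(X, A, B, C, D, E, mu):
    """Quadratic penalty for AX = B and CXD >= E (elementwise), sketch only.

    mu is the penalty parameter; only violations of the inequality,
    i.e. the positive part of E - CXD, are penalised.
    """
    eq_res = A @ X - B
    ineq_violation = np.maximum(E - C @ X @ D, 0.0)
    return 0.5 * np.linalg.norm(eq_res, "fro") ** 2 \
         + 0.5 * mu * np.linalg.norm(ineq_violation, "fro") ** 2

def penalty_gradient(X, A, B, C, D, E, mu):
    """Gradient of penalty_value with respect to X (Lipschitz continuous)."""
    eq_res = A @ X - B
    ineq_violation = np.maximum(E - C @ X @ D, 0.0)
    return A.T @ eq_res - mu * (C.T @ ineq_violation @ D.T)
```

A penalised objective of this kind is smooth with Lipschitz gradient, so it could in principle be passed to a FISTA-style routine such as the sketch above, with the projection handling any remaining simple constraints (e.g. nonnegativity).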
Keywords/Search Tags:Matrix inequality, least squares problem, matrix Krylov subspace, gradient method, fast iterative shrinkage-thresholding algorithm, KKT conditions