
Design And Convergence Analysis For Several New Gradient-type Methods

Posted on: 2020-05-14  Degree: Doctor  Type: Dissertation
Country: China  Candidate: X Y Zhang  Full Text: PDF
GTID: 1480306548491414  Subject: Mathematics
Abstract/Summary:
With the rapid development of science and technology, the optimization problems arising in various fields have become increasingly large in scale. As a general approach to solving optimization problems, gradient-type methods are widely used owing to their low per-iteration complexity and sound theoretical basis. Research on new gradient-type methods therefore has both theoretical significance and practical prospects. On the one hand, the pursuit of efficient algorithms for different practical problems requires us to design efficient gradient-type methods; on the other hand, new gradient-type algorithms pose challenges for their theoretical guarantees, which pushes us to explore new mathematical concepts, develop new proof tools, and propose new proof techniques. Designing and analyzing new gradient-type methods based on the structural features of the optimization problem greatly enriches the theoretical research and provides new solutions for optimization problems in various application fields.

This dissertation covers algorithm framework design, theoretical analysis, and numerical experiments. The main contributions and innovations are as follows:

1. For minimizing the sum of smooth convex component functions and a possibly nonsmooth convex regularization function, we propose an inertial version of the proximal incremental aggregated gradient method (iPIAG). We analyze the linear convergence of the objective function values and of the iterates generated by iPIAG under the assumptions of gradient Lipschitz continuity and strong convexity. Then, when strong convexity is weakened to a quadratic growth condition, a different proof based on a Lyapunov function is used to establish linear convergence. Finally, we present two numerical tests illustrating that iPIAG outperforms the original PIAG.
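As a rough illustration (the precise update rule is given in the dissertation itself, not in this abstract, and the symbols below are only assumed notation), an inertial PIAG step for minimizing $F(x) = \sum_{i=1}^{n} f_i(x) + g(x)$ typically combines a momentum term with a delayed aggregated gradient, followed by a proximal step:

\[
y^{k} = x^{k} + \beta_k \,(x^{k} - x^{k-1}), \qquad
x^{k+1} = \operatorname{prox}_{\alpha g}\Big( y^{k} - \alpha \sum_{i=1}^{n} \nabla f_i\big(x^{k-\tau_i^{k}}\big) \Big),
\]

where $\alpha > 0$ is the step size, $\beta_k \ge 0$ is the inertial parameter, and $\tau_i^{k}$ is a bounded delay of the $i$-th component gradient.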
2. For minimizing a class of nonconvex nonsmooth optimization problems without gradient Lipschitz continuity, a Bregman proximal gradient method with extrapolation (BPGe) is designed; the extrapolation step is introduced to accelerate the Bregman proximal gradient algorithm. First, under general assumptions and with properly chosen parameters, we prove that any limit point of the sequence generated by BPGe is a stationary point of the problem. Moreover, assuming the Kurdyka-Łojasiewicz property, we prove that the whole sequence generated by BPGe converges to a stationary point. Finally, to illustrate the potential of BPGe, we apply it to two important practical problems that arise in many fundamental applications: Poisson linear inverse problems and quadratic inverse problems.

3. For minimizing a class of nonconvex nonsmooth optimization problems involving a linear operator, two alternating minimization methods are designed. The first is a nonconvex proximal alternating minimization method: by introducing a new auxiliary variable, the original problem is split into two simpler subproblems, and each subproblem is solved alternately by a proximal point step. Theoretically, when the Kurdyka-Łojasiewicz property is satisfied, we prove that the whole sequence generated by the algorithm converges to a stationary point. The second is a Bregman primal-dual method: by introducing a dual auxiliary variable, the original problem is transformed into a saddle-point problem, and the Bregman distance replaces the quadratic distance of the classical primal-dual method. The convergence of this algorithm is analyzed. Finally, the effectiveness of the nonconvex proximal alternating minimization method is verified on the ℓ0 minimization problem, and the validity of the Bregman primal-dual method is verified on the Poisson denoising problem.

4. We establish a new exact worst-case linear convergence rate of the proximal gradient method in terms of the proximal gradient norm, which complements recent results and implies a refined descent lemma. The proximal gradient algorithm is a classical method for solving nonsmooth convex optimization problems, and its theory has been extensively studied. First, under the assumptions of gradient Lipschitz continuity and strong convexity, a new exact worst-case linear convergence rate of the proximal gradient method, measured by the proximal gradient norm, is established, complementing the existing theoretical results. Second, the existing descent lemma is improved; based on the new lemma, we sharpen the linear convergence rate of the objective function accuracy under the Polyak-Łojasiewicz inequality.
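For context (a standard definition rather than a quotation from this abstract), the proximal gradient norm in such results is usually the norm of the composite gradient mapping

\[
G_{\alpha}(x) \;=\; \frac{1}{\alpha}\Big( x - \operatorname{prox}_{\alpha g}\big( x - \alpha \nabla f(x) \big) \Big),
\]

which reduces to $\nabla f(x)$ when $g \equiv 0$ and vanishes exactly at the minimizers of $f + g$; an exact worst-case linear rate in $\|G_{\alpha}(x^{k})\|$ therefore quantifies how fast this optimality measure decays along the proximal gradient iterates.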
Keywords/Search Tags: Gradient descent, Proximal gradient methods, Inertial methods, Bregman distance, Nonconvex optimization