
Research And Application Of Conjugate Gradient Method

Posted on: 2024-03-06    Degree: Master    Type: Thesis
Country: China    Candidate: J H Ye    Full Text: PDF
GTID: 2568307091988019    Subject: Computer Science and Technology
Abstract/Summary:
In this paper, two kinds of optimization problems are studied; both are modeled as the minimization of an objective function. In the first kind, the objective function is continuously differentiable and its gradient is easy to compute. In the second kind, the objective function is the sum of a smooth function and a non-smooth function.

For the first kind of problem, Chapter 3 forms a convex combination of the two-term descent Polak-Ribière-Polyak (PRP) method and the three-term descent PRP method, yielding a class of descent PRP methods that reduces to the two-term and three-term descent PRP methods, respectively, for particular parameter values. It is proved that the sufficient descent property of the new methods holds independently of the line search, and that, under appropriate conditions, the new algorithm with an Armijo-type line search is globally convergent. Numerical experiments on large-scale unconstrained optimization problems demonstrate the efficiency of the new algorithm.

Chapter 4 constructs a class of conjugate gradient methods that can be regarded as quasi-Newton methods with a symmetric positive definite matrix. A good property of this class is that the condition number of the search direction matrix is minimal, and the direction it generates is always a descent direction for the objective function. The chapter focuses on the symmetric PRP method, which reduces to the standard PRP method under exact line search. It is proved that the symmetric PRP method with a backtracking line search or a strong Wolfe-type line search is globally convergent under suitable conditions. Large-scale unconstrained optimization problems and an image restoration problem are tested, and comparative numerical experiments show that the algorithm is competitive.

For the second kind of problem, Chapter 5 studies the common l2-l1 problem arising in compressed sensing and in signal and image processing, and proposes an active-set conjugate gradient method based on an active-set identification technique and the two-term descent PRP method. The active-set identification technique can accurately identify the zero components near an optimal solution, so at each iteration it is used to separate the free variables from the active variables. The two-term descent PRP method then updates the free variables, while the direction d_k = -x_k and a gradient-based method update the active variables. Global convergence of the algorithm is proved under appropriate conditions. Numerical experiments on the l2-l1 problem demonstrate the rationality and efficiency of the new algorithm.
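The two-term descent PRP idea that Chapters 3 and 5 build on can be sketched as follows. This is an illustrative implementation assuming the widely used two-term form d_k = -(1 + beta_k * g_k^T d_{k-1} / ||g_k||^2) g_k + beta_k d_{k-1}, which guarantees g_k^T d_k = -||g_k||^2 for any line search; the thesis's exact parameterization (and its convex combination with a three-term variant) may differ.

```python
import numpy as np

def descent_prp(f, grad, x0, tol=1e-8, max_iter=1000):
    """Two-term descent PRP method with Armijo backtracking.

    The direction d_k = -(1 + beta_k * g_k.d_{k-1} / ||g_k||^2) g_k
                        + beta_k * d_{k-1}
    satisfies g_k.d_k = -||g_k||^2, so sufficient descent holds
    independently of the line search used.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g.copy()
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: accept t with f(x + t d) <= f(x) + c t g.d
        t, c = 1.0, 1e-4
        while f(x + t * d) > f(x) + c * t * (g @ d) and t > 1e-14:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = g_new @ (g_new - g) / (g @ g)            # PRP parameter
        theta = 1.0 + beta * (g_new @ d) / (g_new @ g_new)
        d = -theta * g_new + beta * d                   # descent direction
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x, the iterates converge to the unique minimizer A^{-1}b, which makes a convenient sanity check.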
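The variable split in Chapter 5 can be illustrated schematically. The identification rule below is a hypothetical simplification (the threshold `eps` and the subgradient check are assumptions, not the thesis's actual identification technique); only the active-variable update d_i = -x_i is taken from the abstract, and plain steepest descent stands in for the thesis's two-term descent PRP update on the free variables.

```python
import numpy as np

def split_active_free(x, g, mu, eps=1e-3):
    """Flag component i as active (expected to be zero at a solution of
    min f(x) + mu*||x||_1) when x_i is already small and the stationarity
    condition |g_i| <= mu could hold with x_i = 0.
    NOTE: a hypothetical stand-in for the thesis's identification rule."""
    active = (np.abs(x) <= eps) & (np.abs(g) <= mu)
    return np.where(active)[0], np.where(~active)[0]

def assemble_direction(x, g, mu, eps=1e-3):
    """Build one search direction: d_i = -x_i on the active variables
    (driving them to zero), steepest descent on the free variables
    (where the thesis uses a two-term descent PRP update instead)."""
    act, free = split_active_free(x, g, mu, eps)
    d = np.zeros_like(x)
    d[act] = -x[act]
    d[free] = -g[free]
    return d
```

The point of the split is that the non-smooth term only matters near zero: components already pinned at zero get the cheap d_i = -x_i update, while the smooth conjugate gradient machinery works on the remaining free coordinates.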
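For context, the l2-l1 model that Chapter 5 targets is min_x 0.5*||Ax - b||^2 + mu*||x||_1, and a standard baseline solver for it is the proximal gradient (ISTA) iteration sketched below. This is not the thesis's active-set conjugate gradient method, only the problem setup such methods are typically compared on.

```python
import numpy as np

def soft(v, tau):
    """Soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, mu, n_iter=200):
    """Proximal gradient (ISTA) for min 0.5*||Ax - b||^2 + mu*||x||_1,
    a standard baseline for l2-l1 problems in compressed sensing."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)         # gradient of the smooth part
        x = soft(x - grad / L, mu / L)   # forward step, then prox step
    return x
```

ISTA decreases the objective monotonically, which is easy to verify on a small instance; its slow first-order convergence is exactly what motivates faster conjugate-gradient-type methods for the same model.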
Keywords/Search Tags:Unconstrained Optimization, PRP Method, Line Search, Active Set, Compressed Sensing