
Least-Squares-Based Three-Term Conjugate Gradient Methods For Solving Two Kinds Of Problems

Posted on: 2021-04-19
Degree: Master
Type: Thesis
Country: China
Candidate: J Qin
GTID: 2370330611981449
Subject: Operational Research and Cybernetics

Abstract/Summary:
Mathematical optimization is an important branch of operational research and cybernetics. Its core content is the study of optimization theory and numerical algorithms. Over the past half century, mathematical optimization has been widely applied in industrial design, transportation, military defense, economic planning, and other practical fields. In recent years, with the rise of big data, artificial intelligence, and machine learning, mathematical optimization has come to play an increasingly important role.

The conjugate gradient method is an important numerical method in mathematical optimization. It has a simple algorithmic structure, low storage requirements, and good convergence properties, and is thus especially suitable for solving large-scale optimization problems. Traditional conjugate gradient methods focus on smooth unconstrained optimization problems, and research in this field has produced rich results. This dissertation studies least-squares-based three-term conjugate gradient methods for solving large-scale nonlinear equations and nonsmooth optimization problems.

First, based on a least-squares approximation technique, a new three-term conjugate gradient method is proposed for solving large-scale nonlinear equations. The method uses the least-squares technique to construct a new search direction that combines the advantages of existing three-term conjugate gradient methods and thereby improves computational efficiency. The search direction generated by the algorithm has the sufficient descent property, independent of any line search condition. Under proper assumptions, the global convergence and linear convergence rate of the algorithm are proved. Preliminary numerical results show that the proposed method is stable and effective for solving large-scale nonlinear equations.

Second, two least-squares-based three-term conjugate gradient methods are proposed for solving nonsmooth unconstrained convex optimization problems. The Moreau-Yosida regularization technique is used to transform the nonsmooth convex optimization problem into a smooth one. Using approximate gradients of the smooth problem and the idea of least-squares approximation, a new three-term conjugate gradient direction is constructed that does not depend on the line search. Two different line searches are used to generate the step sizes, yielding two corresponding algorithms. Under appropriate conditions, both algorithms are globally convergent. Preliminary numerical experiments also show the effectiveness of the algorithms.

Finally, the dissertation is summarized and directions for further research are pointed out.
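For concreteness, the general shape of a three-term conjugate gradient iteration for a system of nonlinear equations F(x) = 0 can be sketched as follows. This is a minimal illustration only: the abstract does not give the least-squares-based formulas for the direction coefficients, so the sketch uses a classical Hestenes-Stiefel-type pair and a simple norm-reduction backtracking line search, both of which are assumptions rather than the dissertation's method.

import numpy as np

def three_term_cg(F, x0, tol=1e-8, max_iter=500, sigma=1e-4, rho=0.5):
    # Generic three-term conjugate gradient iteration for F(x) = 0.
    # The least-squares-based coefficients from the dissertation are not
    # given in the abstract; a Hestenes-Stiefel-type pair and a
    # norm-reduction backtracking line search are used here instead.
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        t = 1.0                              # backtracking line search on ||F||
        while np.linalg.norm(F(x + t * d)) > (1.0 - sigma * t) * np.linalg.norm(Fx):
            t *= rho
            if t < 1e-12:                    # safeguard: accept a tiny step
                break
        x_new = x + t * d
        F_new = F(x_new)
        y = F_new - Fx
        denom = d @ y
        if abs(denom) < 1e-12:               # safeguard against division by zero
            denom = 1e-12
        beta = (F_new @ y) / denom           # HS-type coefficient (illustrative)
        theta = (F_new @ d) / denom
        d = -F_new + beta * d - theta * y    # three-term direction
        x, Fx = x_new, F_new
    return x

# Example: the system x1^2 + x2 = 3, x1 + x2^2 = 5 has the root (1, 2).
F = lambda x: np.array([x[0] ** 2 + x[1] - 3.0, x[0] + x[1] ** 2 - 5.0])
print(three_term_cg(F, np.array([1.0, 1.0])))

For reference, the Moreau-Yosida regularization used in the second part is the standard construction: for a nonsmooth convex function f and a parameter \lambda > 0,

    F(x) = \min_{y \in \mathbb{R}^n} \left\{ f(y) + \frac{1}{2\lambda} \|y - x\|^2 \right\},

which is continuously differentiable with gradient

    \nabla F(x) = \frac{x - p(x)}{\lambda}, \qquad p(x) = \arg\min_{y \in \mathbb{R}^n} \left\{ f(y) + \frac{1}{2\lambda} \|y - x\|^2 \right\},

where p(x) is the proximal point of x. The dissertation's algorithms work with approximate gradients of F; their precise form is not specified in the abstract.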
Keywords: nonlinear equations, nonsmooth optimization, conjugate gradient method, least-squares, global convergence