
On Improvement Of Some First Order Methods And Their Applications

Posted on: 2021-12-23    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Tsegay Giday Woldu    Full Text: PDF
GTID: 1480306470965829    Subject: Mathematics
Abstract/Summary:
This Ph.D. dissertation focuses on improvements of some first-order methods and their use in large-scale optimization problems. The research is motivated by the wide range of applications of large-scale optimization, such as data science, logistics, production modeling, the design of complex systems, and machine learning. Although large-scale problems arise in many research areas, they incur higher computational and storage costs than small- and medium-sized problems. Consequently, the development of efficient algorithms is a reasonable concern of many researchers, and first-order algorithms are among the most widely used iterative methods for solving optimization problems. These methods rely mainly on information obtained from first-order derivatives; because of their simplicity and low memory requirements, they are considered relatively suitable and efficient.

In this dissertation, we introduce some modified first-order line search methods for solving large-scale optimization problems. Line search methods are among the most popular iterative methods: at each iteration they construct a new descent direction for the objective function and compute a step length that guarantees a sufficient reduction of the objective function along that direction. A great number of line search methods exist for numerical optimization; this research is mainly concerned with improving some first-order line search methods, in particular the nonlinear conjugate gradient methods and the BFGS method, which are among the most popular first-order line search algorithms and have several desirable features.

The conjugate gradient method is one of the most valuable first-order approaches. It was originally introduced to solve systems of linear equations and was later extended to large-scale nonlinear unconstrained optimization problems. The main qualities of nonlinear conjugate gradient methods are that they do not require the Hessian matrix of the objective function and that they converge faster than the steepest descent method. Thanks to their conjugacy property, they become increasingly valuable as the dimension of the problem grows, and the simplicity of their iteration and their low memory requirements make them attractive as the problem size increases. However, some aspects still diminish the global convergence and numerical efficiency of nonlinear conjugate gradient methods. Some families have good convergence properties but tend to take small steps without making significant progress toward the minimum; other families may fail to converge for general nonlinear functions even though their numerical performance is acceptable. In addition, nonlinear conjugate gradient methods are mainly restricted to smooth optimization problems. Therefore, nonlinear conjugate gradient methods that attain strong global convergence and remain numerically effective on a wide range of large-scale problems are still in demand, and it is worthwhile to develop methods that retain both the convergence properties and the numerical efficiency. Since the strength of a nonlinear conjugate gradient method relies on the nature of its search direction and its line search technique, we were motivated to modify the search direction of nonlinear conjugate gradient methods together with their line search techniques.
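For context, the generic nonlinear conjugate gradient iteration referred to above can be summarized as follows; this is a standard textbook sketch, and the particular choice of the parameter $\beta_k$ (and of the descent constant $c$) proposed in the dissertation is not reproduced here:

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k =
\begin{cases}
-g_k, & k = 0,\\
-g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\]

where $g_k = \nabla f(x_k)$, $\alpha_k > 0$ is a step length computed by a line search, and $\beta_k$ is the conjugate gradient parameter. The sufficient descent property mentioned throughout the abstract requires a constant $c > 0$ such that

\[
g_k^{\top} d_k \le -c\,\lVert g_k \rVert^2 \quad \text{for all } k \ge 0.
\]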
In this research, a novel sufficient descent conjugate gradient method is introduced for large-scale general nonlinear smooth unconstrained optimization, and its global convergence is established under mild assumptions. A large number of preliminary numerical experiments were carried out to test the efficiency of the new approach, and the results show that the method is promising. Moreover, by making an additional alteration to the formula of the introduced conjugate direction, a new modified conjugate gradient method is obtained whose search direction possesses the sufficient descent property and lies in a trust region. By combining this method with the well-known projection scheme, we extend its applicability to large-scale nonlinear monotone equations with convex constraints; the global convergence of the resulting method is established under suitable conditions, and the preliminary numerical results demonstrate its efficiency and robustness. In addition, a nonlinear conjugate gradient method that converges globally for nonsmooth convex unconstrained optimization problems is introduced, and its effectiveness is demonstrated by numerical experiments on some nonsmooth academic test problems.

The other aim of this dissertation is to improve the utility of the BFGS method. This method builds up curvature information of the objective function at each iteration by approximating the Hessian matrix. Explicit Hessian computation can be expensive, and the BFGS method is among the most popular algorithms that avoid it. Under some inexact line search techniques, the BFGS method converges superlinearly on unconstrained convex optimization problems. However, because of the storage requirements of the dense Hessian approximation, its efficiency is mainly restricted to small- and medium-sized problems. The limited memory BFGS (L-BFGS) method is a variant that can deal with large-scale unconstrained problems: it stores a few vectors rather than the whole dense matrix. On the other hand, L-BFGS does not converge rapidly and tends to be less efficient on highly ill-conditioned problems; its convergence rate is linear, and, like the nonlinear conjugate gradient methods, its use is typically limited to smooth problems. Therefore, some reasonable improvements of the L-BFGS method are also pursued in this research.
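As background for the quasi-Newton part of the dissertation, the curvature information mentioned above is encoded by the secant condition $B_{k+1} s_k = y_k$, with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$, and L-BFGS applies the inverse update implicitly through the standard two-loop recursion. The following Python sketch illustrates only that standard recursion; it does not reproduce the modified secant equation, the scaled conjugate direction, or the nonmonotone line search developed in the dissertation.

import numpy as np

def lbfgs_direction(g, s_list, y_list):
    # Standard L-BFGS two-loop recursion (Nocedal & Wright, "Numerical
    # Optimization"): returns the search direction d = -H_k g built from the
    # m most recent curvature pairs s_i = x_{i+1} - x_i, y_i = g_{i+1} - g_i,
    # stored from oldest to newest in s_list and y_list.
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        q = q - a * y
        alphas.append(a)
    # Initial Hessian approximation H_0 = gamma * I (a common scaling choice).
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest.
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * np.dot(y, r)
        r = r + (a - b) * s
    return -r  # quasi-Newton search direction d_k

Because only a handful of vector pairs are kept, the storage cost grows linearly with the problem dimension, which is what makes the limited memory variant suitable for large-scale problems.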
In order to exploit the curvature information of the L-BFGS method and the conjugacy properties of the nonlinear conjugate gradient method, we propose a hybrid approach that combines a new modified secant equation for L-BFGS with a new scaled conjugate direction. The proposed method converges strongly for nonsmooth convex problems under a modified nonmonotone line search, and the numerical experiments show the computational efficiency of the new algorithm. Furthermore, some additional strategies are introduced to improve the utility of the L-BFGS method, and we develop a novel proximal L-BFGS method that can deal with nonsmooth nonconvex composite optimization problems. Under some conditions, we have proven that the search direction of the new algorithm converges to a stationary point of the nonsmooth composite objective function.
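For the composite setting $\min_x F(x) = f(x) + h(x)$, with $f$ smooth and $h$ nonsmooth, the proximal machinery referred to above is built on the proximal mapping

\[
\operatorname{prox}_{\lambda h}(x) = \arg\min_{u} \Big\{ h(u) + \tfrac{1}{2\lambda}\,\lVert u - x \rVert^2 \Big\}, \qquad \lambda > 0,
\]

and the basic proximal gradient step reads $x_{k+1} = \operatorname{prox}_{\alpha_k h}\big(x_k - \alpha_k \nabla f(x_k)\big)$. This is only the standard building block; the dissertation's proximal L-BFGS method replaces the plain gradient step with a limited memory quasi-Newton model, whose exact form is not reproduced here.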
Keywords/Search Tags:Conjugate Gradient Method, Limited Memory BFGS Method, Proximal Approaches, Monotone Nonlinear System of Equations, Nonsmooth Optimization