
A New Steepest Descent Method

Posted on: 2022-05-24  Degree: Master  Type: Thesis
Country: China  Candidate: Q S Wang  Full Text: PDF
GTID: 2480306329989709  Subject: Operational Research and Cybernetics
Abstract/Summary:
Nonlinear programming solves optimization problems in which the objective function or the constraints contain one or more nonlinear functions, and it is an important branch of operations research. In the early 1950s, H. W. Kuhn and A. W. Tucker proposed the basic theorems of nonlinear programming, laying a theoretical foundation for the field. Nonlinear programming is widely applied in industry, transportation, economic management, the military, and elsewhere; in "optimal design" in particular it provides both a mathematical foundation and computational methods, so it has important practical value.

The steepest descent method studied in this thesis is one of the classic methods for unconstrained nonlinear programming. It is a minimization algorithm that takes the negative gradient as the descent direction, and is also known as the gradient method; it was proposed by the French mathematician Cauchy in 1847. Its iteration is $x_{k+1} = x_k - \alpha_k g_k$, where $g_k$ is the gradient of the objective function $f(x)$ at the iterate $x_k$ and $\alpha_k > 0$ is the step length. As the simplest method for unconstrained optimization, steepest descent is easy to program, requires little computation and storage, and places no special requirements on the initial point. It also has serious drawbacks, however. With the exact line search $\alpha_k = \arg\min_{\alpha > 0} f(x_k - \alpha g_k)$, the method may converge very slowly: numerical experiments show that when the contours of the objective function are close to circles the method decreases quickly, but when the contours are elongated ellipses it decreases quickly for only a few steps and then falls into a zigzag (sawtooth) pattern in which progress becomes very slow, as the sketch below illustrates. Accelerating convergence by choosing a better step length has therefore become a research direction.
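The following is a minimal sketch of steepest descent with exact line search on a strictly convex quadratic, illustrating the zigzag behaviour; it is not code from the thesis, and the test matrix, tolerance, and function names are illustrative assumptions. For a quadratic in the common normalization $f(x) = \frac{1}{2}x^T H x - b^T x$, the exact step has the closed form $\alpha_k = g_k^T g_k / g_k^T H g_k$.

```python
import numpy as np

def steepest_descent_exact(H, b, x0, tol=1e-8, max_iter=10000):
    """Steepest descent with exact line search on f(x) = 0.5*x^T H x - b^T x."""
    x = x0.astype(float)
    for k in range(max_iter):
        g = H @ x - b                    # gradient g_k at the iterate x_k
        if np.linalg.norm(g) < tol:
            return x, k                  # converged
        alpha = (g @ g) / (g @ (H @ g))  # exact minimizer of f(x_k - alpha*g_k)
        x = x - alpha * g                # x_{k+1} = x_k - alpha_k * g_k
    return x, max_iter

# Elongated contours (condition number 100) trigger the zigzag behaviour:
H = np.diag([1.0, 100.0])
b = np.zeros(2)
x_star, iters = steepest_descent_exact(H, b, np.array([100.0, 1.0]))
print(iters)  # on the order of a thousand iterations for a 2-dimensional problem
```

Under exact line search, successive search directions are orthogonal, so on an ill-conditioned quadratic the iterates bounce between the narrow walls of the contour ellipses, which is exactly the sawtooth phenomenon described above.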
In 1988, Barzilai and Borwein proposed a new way to choose the step length:
$$\alpha_k^{BB1} = \arg\min_{\alpha} \|B(\alpha)s_{k-1} - y_{k-1}\|, \qquad \alpha_k^{BB2} = \arg\min_{\alpha} \|s_{k-1} - B(\alpha)^{-1}y_{k-1}\|,$$
where $B(\alpha) = \alpha^{-1}I$, $s_{k-1} = x_k - x_{k-1}$, and $y_{k-1} = g_k - g_{k-1}$; solving these least-squares problems gives the closed forms $\alpha_k^{BB1} = s_{k-1}^T s_{k-1} / s_{k-1}^T y_{k-1}$ and $\alpha_k^{BB2} = s_{k-1}^T y_{k-1} / y_{k-1}^T y_{k-1}$. The main idea is to use information from the previous iteration to determine the current step length, and the method performs well in high dimensions. When the objective function $f(x)$ is a strictly convex quadratic, $f(x) = g^T x + x^T H x$ with $g \in R^n$ and $H \in R^{n \times n}$ symmetric positive definite, Raydan (see [5]) proved that the BB method is globally convergent. In the two-dimensional case, Barzilai and Borwein (see [4]) gave R-superlinear convergence results; their analysis shows that, counterintuitively, the convergence becomes faster when the matrix $H$ is ill-conditioned. For large-scale unconstrained optimization the BB method is competitive, sometimes even better than several well-known conjugate gradient algorithms, and its simplicity and high numerical efficiency have inspired much research on the steepest descent method (see [6-10]). The method is not monotone, however, so unless nonmonotone techniques are applied it is not easy to extend to general nonlinear functions; a sketch of the BB iteration is given below.

This thesis introduces a new steepest descent method that attains the optimal value of a two-dimensional problem within three iterations, so that it both improves the convergence rate and preserves monotonicity. The algorithm solves low-dimensional problems quickly, also performs well on high-dimensional problems, and provides a new idea for the development of unconstrained optimization.
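To make the BB step lengths concrete, here is a minimal sketch of the BB gradient method using the closed forms above; it is not the new method proposed in this thesis, and the starting step length, the test problem, and the function names are illustrative assumptions.

```python
import numpy as np

def bb_gradient(grad, x0, alpha0=1.0, tol=1e-8, max_iter=1000, variant=1):
    """Barzilai-Borwein gradient method (non-monotone) for min f(x)."""
    x = x0.astype(float)
    g = grad(x)
    alpha = alpha0                     # the first step uses a fixed guess
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            return x, k                # converged
        x_new = x - alpha * g          # gradient step with the current BB step
        g_new = grad(x_new)
        s = x_new - x                  # s_{k-1} = x_k - x_{k-1}
        y = g_new - g                  # y_{k-1} = g_k - g_{k-1}
        if variant == 1:
            alpha = (s @ s) / (s @ y)  # alpha^{BB1} = s^T s / s^T y
        else:
            alpha = (s @ y) / (y @ y)  # alpha^{BB2} = s^T y / y^T y
        x, g = x_new, g_new
    return x, max_iter

# Same ill-conditioned quadratic as before (gradient of 0.5*x^T H x):
H = np.diag([1.0, 100.0])
x_star, iters = bb_gradient(lambda x: H @ x, np.array([100.0, 1.0]))
print(iters)  # typically a few dozen iterations, far fewer than steepest descent
```

On a strictly convex quadratic, $s_{k-1}^T y_{k-1} = s_{k-1}^T H s_{k-1} > 0$, so the step lengths are well defined; for general nonlinear functions the objective may increase between iterations, which is why nonmonotone safeguards are needed in practice.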
Keywords/Search Tags: Unconstrained optimization, Non-linear programming, Steepest descent method, Line search