
A Class of Nonmonotone Line Search Quasi-Newton Methods and Conjugate Gradient Methods

Posted on: 2009-07-29
Degree: Master
Type: Thesis
Country: China
Candidate: J Zhang
Full Text: PDF
GTID: 2190360245972329
Subject: Applied Mathematics
Abstract/Summary:
In this paper, we study the application of a class of nonmonotone line search techniques to quasi-Newton methods and conjugate gradient methods. The line search is of Armijo type; the underlying idea was proposed by Yu-Hong Dai in 2002, who studied it in combination with conjugate gradient methods. The nonmonotone line search considered here relaxes the acceptance (stopping) condition, and when the step size $\alpha_k$ is computed at each iteration, a varying initial trial step size $r_k$ is used rather than the fixed initial trial step size $\alpha$ of the original proposal. If the initial trial step is independent of the gradient, the nonmonotone line search studied in this paper is essentially a derivative-free line search. Its monotone counterpart was studied by Leone, Gaudioso and Grippo, but their discussion covered only trial steps satisfying a special condition; the convergence analysis given here is independent of the initial trial step.

The first three sections of Chapter 1 give a brief overview of the theory, its development, and current research trends. The fourth section presents nonmonotone line search techniques and shows several typical forms, some of which are recent results. The fifth section presents the innovative ideas of this paper.

Chapter 2 studies the application of this nonmonotone line search to quasi-Newton methods. The BFGS formula is one of the best quasi-Newton updates; this chapter combines it with the nonmonotone line search, without any modification to the BFGS formula, and proves that the resulting method is globally convergent. Usually, global convergence proofs for nonmonotone line search methods require (1) the sufficient descent condition $g_k^T d_k \le -c_1 \|g_k\|^2$ and (2) the boundedness condition $\|d_k\| \le c_2 \|g_k\|$, where $c_1, c_2 > 0$. These conditions are strong and are hard for common quasi-Newton methods to satisfy, which is the difficult part of this problem. The general assumptions of this paper are the same as in [25] and slightly weaker than in [69], in particular because we remove the sufficient descent and boundedness assumptions on $d_k$ required in [63]. Numerical results show that the algorithm is effective.

Chapter 3 studies this nonmonotone line search in the context of conjugate gradient methods. At present, most studies of conjugate gradient methods use the monotone Wolfe line search; by establishing the Zoutendijk condition, global convergence is obtained by contradiction. Here we study the global convergence of conjugate gradient methods with Armijo-type line searches, and the proof does not follow the approach just mentioned. We study conditions for the global convergence of four conjugate gradient methods under nonmonotone line searches. With the nonmonotone line search, we prove the global convergence of a modified PRP method for nonconvex functions. Under additional conditions, we also prove the global convergence of a modified DY method, the HZ method, and a modified FR method for nonconvex functions.
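As background, a standard form of the BFGS update and the classical choices of the conjugate gradient parameter are recalled below. This is a minimal reference sketch: the thesis uses the BFGS formula unmodified but studies modified variants of the PRP, DY, and FR formulas, and the exact modifications are not specified in this abstract.

% Standard BFGS update, with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k:
\[
  B_{k+1} = B_k - \frac{B_k s_k s_k^{T} B_k}{s_k^{T} B_k s_k}
                + \frac{y_k y_k^{T}}{y_k^{T} s_k}.
\]
% Classical conjugate gradient parameters (the thesis works with
% modified versions of the PRP, DY, and FR choices):
\[
  \beta_k^{FR}  = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
  \beta_k^{PRP} = \frac{g_{k+1}^{T}(g_{k+1}-g_k)}{\|g_k\|^2}, \qquad
  \beta_k^{DY}  = \frac{\|g_{k+1}\|^2}{d_k^{T}(g_{k+1}-g_k)}, \qquad
  \beta_k^{HZ}  = \frac{1}{d_k^{T} y_k}\Bigl(y_k - 2 d_k \frac{\|y_k\|^{2}}{d_k^{T} y_k}\Bigr)^{T} g_{k+1}.
\]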
Based on these results, the four algorithms are compared numerically.

The innovative ideas of this paper are as follows: (1) For this class of line searches, the numerical results show that, whether the nonmonotone or the monotone version is used, it performs better than the traditional Grippo-Lampariello-Lucidi line search; a sketch of a line search of this general form is given below. (2) The analysis of the nonmonotone line search is independent of the sufficient descent condition and the boundedness condition. (3) With the nonmonotone line search, we prove the global convergence of the BFGS method and of the four conjugate gradient methods above.
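The following Python sketch illustrates a nonmonotone Armijo-type backtracking line search of the general kind described above: the acceptance test compares against the maximum of recent function values, and the initial trial step r_k varies from iteration to iteration and is supplied by the outer algorithm. The function name, the parameters delta and rho, and the particular nonmonotone condition used here are illustrative assumptions; the exact condition and the rule for choosing r_k in the thesis may differ.

import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, r_k, delta=1e-4, rho=0.5, max_backtracks=50):
    """Nonmonotone Armijo-type backtracking line search (illustrative sketch).

    f        : objective function, f(x) -> float
    x        : current iterate (numpy array)
    d        : search direction (numpy array)
    g        : gradient at x (numpy array)
    f_hist   : recent function values used as the nonmonotone reference
    r_k      : varying initial trial step size chosen by the outer algorithm
    delta    : Armijo parameter in (0, 1)
    rho      : backtracking factor in (0, 1)
    """
    f_ref = max(f_hist)            # nonmonotone reference: max of recent f-values
    slope = float(g @ d)           # directional derivative g_k^T d_k (assumed negative)
    alpha = r_k                    # start from the varying initial trial step
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + delta * alpha * slope:
            return alpha           # nonmonotone Armijo condition satisfied
        alpha *= rho               # otherwise shrink the step and retry
    return alpha                   # fall back to the last trial step


# Toy usage on a quadratic with a steepest-descent direction:
if __name__ == "__main__":
    f = lambda x: float(x @ x)
    x = np.array([1.0, -2.0])
    g = 2.0 * x
    d = -g
    alpha = nonmonotone_armijo(f, x, d, g, f_hist=[f(x)], r_k=1.0)
    print("accepted step size:", alpha)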
Keywords/Search Tags: nonmonotone line search, quasi-Newton methods, BFGS updating formula, conjugate gradient methods, unconstrained optimization, global convergence