
The Convergence Of Sparse Quasi-Newton Methods For Partially Separable Optimization Problems

Posted on: 2013-02-02
Degree: Master
Type: Thesis
Country: China
Candidate: H Luo
Full Text: PDF
GTID: 2230330374490914
Subject: Operational Research and Cybernetics
Abstract/Summary:
Quasi-Newton methods form a very important class of iterative methods for solving optimization problems. The major advantages of this class of methods include global and superlinear convergence under appropriate conditions, and no need to compute second-order derivatives of the objective function. Due to these attractive properties, quasi-Newton methods have become popular for solving small- and medium-scale optimization problems. On the other hand, however, the matrices generated by a quasi-Newton method are generally dense. As a result, standard quasi-Newton methods cannot be used to solve large-scale problems. To make quasi-Newton methods applicable to large-scale optimization, many kinds of sparse quasi-Newton methods have recently been developed. These methods generate quasi-Newton matrices that retain the same sparse structure as the Hessian of the objective function. Consequently, they can be used to solve large-scale problems.

So far, research on sparse quasi-Newton methods has focused on their local convergence; little attention has been paid to their global convergence. In this thesis, we study a class of sparse quasi-Newton methods called partially separable quasi-Newton methods. We pay particular attention to the well-known BFGS- and DFP-type partially separable quasi-Newton methods. Both methods preserve the sparse structure of the Hessian of the objective function, and each generates a descent direction for the objective function. We show that, under appropriate conditions, the partially separable DFP method is locally superlinearly convergent. By using the so-called cautious BFGS update, we propose a partially separable cautious BFGS (CBFGS) method and establish its local superlinear convergence.
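The partially separable structure and the cautious BFGS safeguard described above can be sketched as follows. This is a minimal illustration, not the thesis's exact algorithm: the chained test structure, the synthetic update data, and all parameter values (eps, alpha) are assumptions made for demonstration.

```python
import numpy as np

def cautious_bfgs(B, s, y, gnorm, eps=1e-6, alpha=1.0):
    """Apply the BFGS formula only when the cautious condition
    s'y / s's >= eps * ||g||^alpha holds; otherwise keep B unchanged.
    Skipping the update keeps B symmetric positive definite."""
    sty = float(s @ y)
    if sty <= 0 or sty / float(s @ s) < eps * gnorm**alpha:
        return B  # cautious safeguard: skip the update
    Bs = B @ s
    return B - np.outer(Bs, Bs) / float(s @ Bs) + np.outer(y, y) / sty

def assemble(n, idx_sets, blocks):
    """Scatter the small element matrices B_i into the full n-by-n
    approximation sum_i U_i' B_i U_i, which inherits the Hessian's
    sparsity pattern from the index sets I_i."""
    B = np.zeros((n, n))
    for I, Bi in zip(idx_sets, blocks):
        B[np.ix_(I, I)] += Bi
    return B

# Chained example: a partially separable f(x) = sum_i f_i(x_i, x_{i+1})
# has a tridiagonal Hessian, so each element touches indices {i, i+1}.
n = 5
idx_sets = [[i, i + 1] for i in range(n - 1)]
blocks = [np.eye(2) for _ in idx_sets]  # initial element matrices B_i = I

# One illustrative cautious-BFGS update per element block, using
# synthetic steps s and gradient differences y with positive curvature.
rng = np.random.default_rng(0)
for k in range(len(idx_sets)):
    s = rng.standard_normal(2)
    y = s + 0.1 * rng.standard_normal(2)
    blocks[k] = cautious_bfgs(blocks[k], s, y, gnorm=1.0)

B_full = assemble(n, idx_sets, blocks)
# Entries outside the tridiagonal band stay exactly zero: the assembled
# approximation keeps the same sparse structure as the Hessian.
off = [abs(B_full[i, j]) for i in range(n) for j in range(n) if abs(i - j) > 1]
print(max(off))  # 0.0
```

The key point of the partitioned approach is visible in `assemble`: only the small dense blocks are ever updated, while the full matrix is never densified.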
We also show that the method with a Wolfe line search or certain non-monotone line searches is globally and superlinearly convergent, even for nonconvex minimization. Moreover, after finitely many iterations, the unit step length is always accepted. Finally, we report numerical experiments testing the methods. The results show that the partially separable CBFGS method performs much better than the ordinary BFGS method, in particular on large-scale problems.
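The Wolfe conditions invoked above, and the eventual acceptance of the unit step length, can be illustrated on a toy convex quadratic. This is a minimal sketch: the quadratic, the constants c1 and c2, and the function names are assumptions for illustration, not values taken from the thesis.

```python
import numpy as np

def satisfies_wolfe(f, grad, x, d, t, c1=1e-4, c2=0.9):
    """Weak Wolfe conditions at step length t along descent direction d:
    sufficient decrease (Armijo) plus a curvature condition."""
    gd = grad(x) @ d
    suff_decrease = f(x + t * d) <= f(x) + c1 * t * gd
    curvature = grad(x + t * d) @ d >= c2 * gd
    return bool(suff_decrease and curvature)

# Toy quadratic f(x) = 0.5 ||x||^2 with gradient x; the quasi-Newton
# direction d = -x is exact here, so the unit step t = 1 lands on the
# minimizer and both Wolfe conditions hold.
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
x0 = np.array([1.0, -2.0])

print(satisfies_wolfe(f, grad, x0, -x0, t=1.0))   # True: unit step accepted
print(satisfies_wolfe(f, grad, x0, -x0, t=0.01))  # False: fails curvature
```

The second call shows why the curvature condition matters: a very short step decreases f but leaves the directional derivative nearly unchanged, so it is rejected.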
Keywords/Search Tags: partially separable optimization problems, sparse quasi-Newton methods, partitioned BFGS method, global convergence, superlinear convergence