
Sparse Quasi-Newton Methods For Partially Separable Nonlinear Equations And Optimization Problems

Posted on: 2017-01-02
Degree: Doctor
Type: Dissertation
Country: China
Candidate: H P Cao
Full Text: PDF
GTID: 1310330512959006
Subject: Computational Mathematics
Abstract/Summary:
Quasi-Newton methods are widely used for solving nonlinear equations and optimization problems. An attractive property of these methods is their superlinear convergence without computing Jacobian or Hessian matrices. When combined with a line search or trust region technique, they can even be made globally convergent. Unfortunately, the matrices generated by a quasi-Newton method are generally dense, so ordinary quasi-Newton methods cannot solve large-scale problems directly. In many practical fields, such as scientific computing and engineering, problems often possess sparse structure. For example, the nonlinear equations arising from the discretization of many differential equations are very sparse. There are also many optimization problems whose objective function can be written as a sum of element functions, each depending on only a few components of the variable vector. It is therefore desirable to design efficient sparse quasi-Newton methods that exploit the special structure of the problem. Indeed, the study of sparse quasi-Newton methods has become an important research topic in optimization and has received much attention. Many sparse quasi-Newton methods have been developed for solving large-scale nonlinear equations and optimization problems, and most of them retain the superlinear convergence of their dense counterparts. Sparse quasi-Newton methods have thus become important iterative methods for large-scale nonlinear equations and optimization problems.

So far, however, the study of sparse quasi-Newton methods has focused on local convergence; global convergence has seldom been addressed. A major difficulty is that some nice properties of ordinary quasi-Newton methods, such as positive definiteness and the least-change property, are lost after sparsification. Consequently, the study of the global convergence of sparse quasi-Newton methods becomes much more difficult. In this thesis, we first investigate the properties of sparse quasi-Newton methods. By the use of line searches, we then study the global convergence of some sparse quasi-Newton methods for solving nonlinear equations and optimization problems.

We first study sparse quasi-Newton methods for solving nonlinear equations. The Schubert method is a well-known sparse quasi-Newton method for sparse systems of nonlinear equations; it extends Broyden's rank-one quasi-Newton method to sparse problems. A good property of Schubert's method is that the quasi-Newton matrices it generates retain the same sparsity structure as the Jacobian of the system. However, Schubert's method usually performs worse than Broyden's rank-one method, possibly because the sparsity constraints are too strong and cause excessive modification of the approximate Jacobian. In this thesis, we concentrate on so-called partially separable nonlinear equations. Using a technique similar to the partitioned BFGS method, we propose two partitioned quasi-Newton methods for solving partially separable nonlinear equations. When derivatives of the equation cannot be computed, we propose a partitioned Broyden rank-one method; the quasi-Newton matrices it generates approximately preserve the sparsity structure of the Jacobian. When derivatives are available, we use automatic differentiation to propose a partitioned adjoint Broyden method, which also approximately preserves the sparsity structure of the Jacobian and maintains the linear contravariance of the adjoint Broyden method. Under appropriate conditions, we prove the global convergence of both methods. Moreover, we show that after finitely many iterations the unit step length is always accepted, so the methods locally reduce to full-step methods and are consequently superlinearly convergent. We conduct numerical experiments comparing the proposed methods with Broyden's rank-one method and the adjoint Broyden method in terms of the number of iterations, the number of function evaluations, and CPU time; the results show that the proposed methods outperform the compared methods. We also compare the proposed methods with Schubert's method, and the results further support the proposed methods.

We then study sparse quasi-Newton methods for solving partially separable unconstrained optimization problems and propose a partitioned PSB method. Exploiting the special structure of the Hessian of the objective function, we propose a partitioned PSB update. Since the quasi-Newton matrix generated by the method may not be positive definite, the quasi-Newton direction can fail to be a descent direction of the objective function; we therefore construct a sufficient descent direction by introducing a projection strategy. Under appropriate conditions, we show that the partitioned PSB method with an Armijo or Wolfe line search is globally and superlinearly convergent for the minimization of a uniformly convex function. Numerical experiments on 30 test problems compare the proposed method with the well-known partitioned BFGS method and the limited-memory BFGS (L-BFGS) method; the results show that the projected partitioned PSB method performs well in the number of iterations, function evaluations, gradient evaluations, and CPU time.

Finally, we study quasi-Newton methods for solving symmetric nonlinear equations and propose two classes of quasi-Newton methods. Following a derivation similar to that of the PSB update, we derive a new class of quasi-Newton updates called symmetric adjoint Broyden updates. The symmetric adjoint updates share some nice properties with their non-symmetric versions, such as the least-change property and heredity on affine systems. We show that, with a suitable line search technique, the symmetric adjoint Broyden method with the adjoint Broyden tangent update is globally and superlinearly convergent. We also propose a class of rank-two quasi-Newton methods based on adjoint Broyden updates for solving symmetric nonlinear equations. The new rank-two updates not only share nice properties of the BFGS method, such as positive definiteness and the least-change property, but also guarantee that the quasi-Newton matrix approximates the Jacobian of the equation exactly along the quasi-Newton direction. Under appropriate conditions, the proposed rank-two quasi-Newton method is proved to be globally and superlinearly convergent. Our numerical results indicate that the two proposed methods outperform the well-known BFGS method in the solution of symmetric nonlinear equations.
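To make the secant machinery underlying these methods concrete, the following minimal Python/NumPy sketch implements the classical dense Broyden rank-one method, whose update B+ = B + (y - Bs)s^T / (s^Ts) is the building block that the partitioned and Schubert-type variants above sparsify. The test system F, the starting point, and all tolerances are illustrative choices, not taken from the thesis; the thesis's partitioned methods would instead apply this update to each element function and assemble a sparse approximate Jacobian.

```python
import numpy as np

def broyden_rank_one(F, x0, tol=1e-10, max_iter=100):
    """Broyden's rank-one quasi-Newton method for F(x) = 0.

    The approximate Jacobian B is corrected by the least-change
    secant update  B+ = B + (y - B s) s^T / (s^T s),
    so no derivatives of F are ever evaluated.
    """
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)            # initial Jacobian approximation
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)   # quasi-Newton step: B s = -F(x)
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx                # secant pair (s, y)
        B += np.outer(y - B @ s, s) / (s @ s)
        x, Fx = x_new, F_new
    return x

# Illustrative componentwise test system: F(x) = x + x**3 - 1
F = lambda x: x + x**3 - 1.0
root = broyden_rank_one(F, np.zeros(2))
```

Note that each rank-one correction is a full outer product, so even when the true Jacobian is sparse, B fills in; this is exactly the density problem that the sparse and partitioned updates discussed in the thesis are designed to avoid.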
Keywords/Search Tags: Partially separable nonlinear equations, Partially separable optimization problems, Sparse quasi-Newton methods, Global convergence, Superlinear convergence