
Numerical Methods For Nonlinear Monotone Equations And Nonsmooth Optimization Problems

Posted on: 2016-03-28    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Y P Hu    Full Text: PDF
GTID: 1220330467976661    Subject: Applied Mathematics
Abstract/Summary:
In this thesis, we focus on new methods for solving large-scale nonlinear monotone equations, nonsmooth convex optimization problems, and the matrix l2,1-norm minimization problem. Under suitable conditions, the global convergence of the proposed algorithms is established, and numerical results show that the new methods are efficient.

In Chapter 2, based on the hyperplane projection technique, a modified Liu-Storey conjugate gradient algorithm is proposed for solving nonlinear monotone equations. The algorithm has the following properties: (i) the search direction satisfies the sufficient descent property; (ii) the method is derivative-free and, owing to its low storage requirement, can be applied to large-scale problems. Preliminary numerical results show that the algorithm is very promising. Furthermore, based on the Lipschitz constant and the projection technique, a Wei-Yao-Liu conjugate gradient projection algorithm is given in which the sufficient descent condition is satisfied at each iteration, and the algorithm is proved to be globally convergent.

In Chapter 3, a Wei-Yao-Liu conjugate gradient projection algorithm is proposed for solving nonlinear monotone equations with convex constraints. The new method requires neither the computation of derivatives nor the solution of any linear equations. Under suitable conditions, its global convergence is established. Preliminary numerical results show that the proposed method is efficient and promising.

In Chapter 4, by combining the Moreau-Yosida regularization, the proximal method, and a nonmonotone line search technique, we propose a multivariate spectral gradient algorithm as well as modified Liu-Storey and Wei-Yao-Liu conjugate gradient algorithms for nonsmooth convex minimization. All of these methods share a significant property: the directions they generate satisfy the sufficient descent condition under the nonmonotone line search, which guarantees the global convergence of the proposed methods.

In Chapter 5, in order to obtain more efficient methods for the matrix l2,1-norm minimization problem, an inexact alternating direction algorithm with a gradient technique and three inexact accelerated proximal gradient algorithms are given. These methods are globally convergent under suitable conditions, and numerical results show that they are efficient.

Conclusions and perspectives for future research are discussed in Chapter 6.
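To illustrate the hyperplane projection framework behind Chapters 2 and 3, the following Python sketch combines a conjugate gradient-type direction with the projection step of Solodov-Svaiter type. It is only a sketch under our own assumptions: the plain Liu-Storey beta, the simple backtracking rule, and the test function F(x) = x + sin(x) are illustrative choices, not the thesis's modified algorithms.

```python
import numpy as np

def cg_hyperplane_projection(F, x0, sigma=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Derivative-free CG-type iteration with hyperplane projection for monotone F(x) = 0.
    Illustrative sketch: beta below is the plain Liu-Storey formula, not the thesis's
    modified variant, and the line search is a simple backtracking rule."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        # Backtracking: find alpha with -F(x + alpha*d)^T d >= sigma * alpha * ||d||^2
        alpha = 1.0
        while True:
            z = x + alpha * d
            Fz = F(z)
            if -(Fz @ d) >= sigma * alpha * (d @ d) or alpha < 1e-12:
                break
            alpha *= rho
        if np.linalg.norm(Fz) <= tol:        # z already solves the equation
            return z
        # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}
        x_new = x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
        Fx_new = F(x_new)
        # Liu-Storey-type direction update (safeguarded denominator)
        beta = (Fx_new @ (Fx_new - Fx)) / max(-(d @ Fx), 1e-12)
        d = -Fx_new + beta * d
        x, Fx = x_new, Fx_new
    return x

# Example: F(x) = x + sin(x) is monotone on R^n
if __name__ == "__main__":
    F = lambda x: x + np.sin(x)
    x_star = cg_hyperplane_projection(F, np.full(100, 2.0))
    print(np.linalg.norm(F(x_star)))   # should be small
```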
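As a rough illustration of the Chapter 4 setting, the sketch below runs a spectral (Barzilai-Borwein) gradient iteration on the Moreau-Yosida envelope, whose gradient is available through the proximal mapping. The scalar BB step, the safeguards, and the l1-norm example are our simplifications; the thesis's multivariate spectral step and nonmonotone line search are not reproduced here.

```python
import numpy as np

def moreau_yosida_grad(prox_f, x, lam):
    """Gradient of the Moreau-Yosida envelope F_lam(x) = min_y f(y) + ||y - x||^2/(2*lam),
    which equals (x - prox_{lam f}(x)) / lam."""
    return (x - prox_f(x, lam)) / lam

def spectral_gradient_moreau(prox_f, x0, lam=1.0, tol=1e-8, max_iter=500):
    """Barzilai-Borwein (spectral) gradient iteration on the Moreau-Yosida envelope.
    A scalar BB step without line search is used; the thesis's multivariate spectral
    method uses a componentwise step and a nonmonotone line search."""
    x = np.asarray(x0, dtype=float)
    g = moreau_yosida_grad(prox_f, x, lam)
    alpha = 1.0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = moreau_yosida_grad(prox_f, x_new, lam)
        s, y = x_new - x, g_new - g
        alpha = np.clip((s @ s) / max(s @ y, 1e-12), 1e-6, 1e6)   # safeguarded BB step
        x, g = x_new, g_new
    return x

# Example: f(x) = ||x||_1, whose proximal mapping is soft-thresholding
if __name__ == "__main__":
    soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    x_star = spectral_gradient_moreau(soft, np.random.randn(20))
    print(np.max(np.abs(x_star)))   # close to 0, the minimizer of ||x||_1
```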
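For the matrix l2,1-norm problem of Chapter 5, a minimal accelerated proximal gradient sketch is given below under the assumed model min_X 0.5*||A X - B||_F^2 + mu*||X||_{2,1}. The exact proximal steps and the synthetic data example are illustrative assumptions and do not reproduce the inexact algorithms studied in the thesis.

```python
import numpy as np

def prox_l21(X, tau):
    """Row-wise soft-thresholding: the proximal operator of tau * ||X||_{2,1}."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    return np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12)) * X

def apg_l21(A, B, mu, max_iter=500, tol=1e-8):
    """Accelerated proximal gradient for min_X 0.5*||A X - B||_F^2 + mu*||X||_{2,1}.
    A plain (exact) APG sketch; the inexact variants in the thesis solve the proximal
    subproblems only approximately."""
    n, k = A.shape[1], B.shape[1]
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    X = np.zeros((n, k))
    Y, t = X.copy(), 1.0
    for _ in range(max_iter):
        grad = A.T @ (A @ Y - B)           # gradient of the smooth part at Y
        X_new = prox_l21(Y - grad / L, mu / L)
        if np.linalg.norm(X_new - X) <= tol * max(1.0, np.linalg.norm(X)):
            return X_new
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = X_new + ((t - 1.0) / t_new) * (X_new - X)   # Nesterov extrapolation
        X, t = X_new, t_new
    return X

# Example: recover a row-sparse X from noisy measurements B = A X + noise
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 100))
    X_true = np.zeros((100, 5)); X_true[:8] = rng.standard_normal((8, 5))
    B = A @ X_true + 0.01 * rng.standard_normal((60, 5))
    X_hat = apg_l21(A, B, mu=1.0)
    print(np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
```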
Keywords/Search Tags: nonlinear programming, nonsmooth problem, matrix norm minimization, conjugate gradient method, alternating direction method, global convergence