
Conjugate Gradient Algorithms For Unconstrained Optimization And Nonlinear Monotone Equations

Posted on: 2016-07-05
Degree: Master
Type: Thesis
Country: China
Candidate: J W Li
GTID: 2180330470981695
Subject: Operational Research and Cybernetics
Abstract/Summary:
Owing to their low memory requirements, simple iterative form, and fast convergence, conjugate gradient algorithms have received much attention and have been widely used to solve practical problems. Building on existing results, this thesis proposes several nonlinear conjugate gradient algorithms for unconstrained minimization problems and nonlinear monotone equations. We prove that each algorithm converges globally and demonstrate its efficiency through a series of numerical experiments.

In Chapter one, we introduce preliminary results on unconstrained optimization, mainly the definitions of optimal solutions, line search rules, and descent algorithms. We recall some recent developments in descent conjugate gradient methods and quasi-Newton methods, and in particular list some optimization algorithms for nonlinear monotone equations. We also briefly review the main contributions of the thesis and fix the notation used throughout.

In Chapter two, we modify the descent conjugate gradient method of Xiao, Song, and Wang for unconstrained minimization. Both proposed algorithms use the Armijo line search instead of the Wolfe line search. We establish global convergence for nonconvex problems. Finally, we test both algorithms on a series of problems from the CUTEr library; the results show that both algorithms are stable and efficient.

In Chapter three, based on the projected Newton method of Solodov and Svaiter, we extend the descent conjugate gradient method of Dai and Kou to solve convex-constrained monotone equations. Under mild conditions, we show that the proposed method converges globally. Performance comparisons show that the proposed method is practical, efficient, and competitive with the well-known solver CGD.

In Chapter four, we give some concluding remarks and suggest topics for further research.
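Since the abstract only summarizes the two algorithmic frameworks, the following Python sketch illustrates their general shape: a descent conjugate gradient iteration with Armijo backtracking (the setting of Chapter two) and a hyperplane-projection iteration for monotone equations in the spirit of Solodov and Svaiter (the setting of Chapter three). The function names, the Polak-Ribiere-plus choice of beta, the plain direction d = -F(x), and the parameters sigma, rho, tol are illustrative assumptions, not the thesis's actual algorithms, which use modified Xiao-Song-Wang and Dai-Kou-type directions.

```python
import numpy as np


def armijo_cg(f, grad, x0, sigma=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Descent conjugate gradient iteration with Armijo backtracking.

    Illustrative sketch only: beta is the Polak-Ribiere-plus choice, and the
    direction is reset to steepest descent whenever it fails to be a descent
    direction; the thesis uses a modified Xiao-Song-Wang direction instead.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        gtd = g @ d
        alpha = 1.0
        # Armijo condition: f(x + alpha d) <= f(x) + sigma * alpha * g^T d
        while f(x + alpha * d) > f(x) + sigma * alpha * gtd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PRP+ (assumed)
        d = -g_new + beta * d
        if g_new @ d >= 0.0:  # safeguard: keep d a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x


def hyperplane_projection(F, x0, proj_C=lambda x: x,
                          sigma=1e-4, rho=0.5, tol=1e-6, max_iter=1000):
    """Projection framework for monotone equations F(x) = 0 over a convex
    set C, in the spirit of Solodov and Svaiter.

    Illustrative sketch only: the search direction here is simply d = -F(x),
    whereas the thesis builds d from a Dai-Kou-type conjugate gradient
    formula; proj_C is the Euclidean projection onto C (identity by default).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        d = -Fx
        # backtrack until -F(x + alpha d)^T d >= sigma * alpha * ||d||^2
        alpha = 1.0
        while -(F(x + alpha * d) @ d) < sigma * alpha * (d @ d):
            alpha *= rho
        z = x + alpha * d
        Fz = F(z)
        # project x onto the hyperplane {y : F(z)^T (y - z) = 0}, then onto C
        x = proj_C(x - (Fz @ (x - z)) / (Fz @ Fz) * Fz)
    return x
```

For example, armijo_cg(lambda x: x @ x, lambda x: 2 * x, np.ones(5)) drives the gradient of the quadratic ||x||^2 to zero in a few steps; the convergence theory and the CUTEr/CGD comparisons reported in the thesis are, of course, specific to the directions actually studied there.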
Keywords/Search Tags: unconstrained optimization, convex-constrained monotone equations, quasi-Newton algorithm, conjugate gradient method, line search