
Conjugate Gradient Methods And Self-Scaling Quasi-Newton Methods For Optimization

Posted on: 2009-04-07    Degree: Doctor    Type: Dissertation
Country: China    Candidate: W Y Cheng    Full Text: PDF
GTID: 1100360242990312    Subject: Applied Mathematics
Abstract/Summary:
In this thesis, we study nonlinear conjugate gradient methods and self-scaling quasi-Newton methods for solving unconstrained optimization problems and systems of nonlinear equations. We investigate the global convergence properties and the numerical performance of these methods.

In Chapter 2, we propose a class of modified conjugate gradient methods for solving unconstrained optimization problems. A common property of these methods is that the direction d_k generated by any member of the class satisfies g_k^T d_k = -‖g_k‖^2, which implies that d_k is a sufficient descent direction for the objective function. Moreover, if the line search is exact, the modified methods reduce to the corresponding standard conjugate gradient methods. We pay particular attention to the modified YT and YT+ methods. We show that the modified YT method with the standard Armijo line search is globally convergent for uniformly convex problems. We also prove that the modified YT+ method with the strong Wolfe line search converges globally for nonconvex problems. Extensive numerical experiments show that the proposed methods outperform the CG_DESCENT method.

In Chapter 3, we propose a modified PRP method. We show that the modified PRP method with an Armijo-type line search is globally convergent. We also establish the global convergence of the modified PRP+ method with the strong Wolfe line search. The reported numerical results show that the proposed methods perform better than existing conjugate gradient methods.

In Chapter 4, based on the three-term modified PRP method [16] and the modified PRP method proposed in Chapter 2, we propose a family of derivative-free conjugate gradient methods for solving large-scale nonlinear systems of equations. Under appropriate conditions, the global convergence of the proposed methods is established. Preliminary numerical results show that the proposed methods are competitive with the DF-SANE method [90].

In Chapter 5, we scale the quasi-Newton equation and propose a spectral-scaling BFGS method. The method has a good self-correcting property and can correct large eigenvalues. Compared with the standard BFGS method, the spectral-scaling BFGS method does not deteriorate the single-step convergence rate when minimizing an n-dimensional quadratic function. In addition, when the method with exact line search is applied to minimize an n-dimensional strictly convex quadratic function, it terminates within n steps. Under appropriate conditions, we show that the spectral-scaling BFGS method with the Wolfe line search is globally and R-linearly convergent for uniformly convex problems. The reported numerical results show that the spectral-scaling BFGS method outperforms the standard BFGS method.

In Chapter 6, we extend the results obtained in Chapter 5 to the Broyden class of quasi-Newton methods and establish their global convergence.

In Chapter 7, based on the spectral-scaling BFGS method, we propose a spectral-scaling limited memory BFGS method for large-scale unconstrained optimization. We prove that the method is globally convergent for uniformly convex problems. Numerical experiments indicate that the method is competitive with the standard limited memory BFGS method.
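To illustrate the sufficient descent property g_k^T d_k = -‖g_k‖^2 discussed for Chapters 2 and 3, the following is a minimal sketch, not the thesis's exact algorithm: the abstract does not give the precise direction formula, so the update below uses one standard construction (a modified PRP direction with an extra gradient term) that enforces the property for any choice of beta_k; the function and parameter names are illustrative assumptions.

```python
# Hypothetical sketch: a sufficient-descent modified PRP direction with an
# Armijo-type backtracking line search.  The extra gradient term in the
# direction update guarantees g_k^T d_k = -||g_k||^2 regardless of beta_k.
import numpy as np

def modified_prp(f, grad, x0, tol=1e-6, max_iter=1000, rho=0.5, sigma=1e-4):
    x = x0.copy()
    g = grad(x)
    d = -g                                      # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Armijo backtracking line search along d.
        alpha = 1.0
        while f(x + alpha * d) > f(x) + sigma * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        if np.linalg.norm(g_new) <= tol:
            return x_new
        y = g_new - g
        beta = g_new.dot(y) / g.dot(g)          # PRP parameter
        # Modified direction: the extra g_new term enforces
        # g_new^T d_new = -||g_new||^2 (sufficient descent).
        d = -(1.0 + beta * g_new.dot(d) / g_new.dot(g_new)) * g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, calling modified_prp with f(x) = x.dot(x) and grad(x) = 2 * x recovers the minimizer at the origin in a few iterations; the design point is that descent is guaranteed by construction rather than by the line search.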
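For the spectral-scaling BFGS idea of Chapters 5 to 7, the following is likewise a hedged sketch rather than the thesis's method: the abstract does not specify the scaling parameter, so tau below uses the familiar spectral (Barzilai-Borwein) quotient s^T s / s^T y purely as an illustration of scaling the current approximation before a standard BFGS update.

```python
# Hypothetical sketch of a spectral-scaling BFGS update in inverse form.
# tau is an illustrative spectral scaling factor; the remainder is the
# ordinary BFGS inverse-Hessian update.
import numpy as np

def spectral_scaled_bfgs_update(H, s, y):
    """Return the updated inverse-Hessian approximation."""
    sy = s.dot(y)
    if sy <= 1e-12:                  # skip the update if curvature is not positive
        return H
    tau = s.dot(s) / sy              # spectral scaling factor (illustrative choice)
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # Scale the current approximation, then apply the standard BFGS formula.
    return V @ (tau * H) @ V.T + rho * np.outer(s, s)
```

Scaling H before the update changes its eigenvalue distribution at each step, which is the mechanism behind the self-correcting behavior the abstract attributes to the spectral-scaling BFGS method.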
Keywords/Search Tags: Nonlinear conjugate gradient method, self-scaling quasi-Newton method, global convergence