
Study on Several New Nonlinear Conjugate Gradient Algorithms and Analysis of Their Global Convergence

Posted on: 2011-12-01
Degree: Master
Type: Thesis
Country: China
Candidate: W Cao
Full Text: PDF
GTID: 2120360308958722
Subject: Computational Mathematics

Abstract/Summary:
Optimization problems are usually divided into two types: unconstrained and constrained. This thesis approaches constrained optimization problems through their transformation into unconstrained ones, since unconstrained optimization is the foundation and an effective tool of optimization. Nonlinear conjugate gradient algorithms are common and effective methods that can solve most large-scale unconstrained problems efficiently. They have been widely applied in social science, natural science, production practice, and modern management.

Building on research results from home and abroad, and after careful analysis, elaboration, and validation, this thesis obtains new algorithms by selecting proper line search criteria, improving the parameter βk, and constructing a new search direction dk. The new methods inherit and extend the achievements of their predecessors. The results obtained in this dissertation may be summarized as follows:

1. The PRP method is generally believed to be the most efficient conjugate gradient method at present. A new PRP algorithm is proposed that has the sufficient descent property, and whose search direction remains in a trust region automatically without carrying out any line search rule. Moreover, the algorithm possesses favorable convergence properties for both nonconvex and uniformly convex functions, and a proof of the linear convergence rate is given.

2. A modified conjugate gradient formula βkMLS, based on the formula of the Liu-Storey (LS) nonlinear conjugate gradient method, is proposed. It is proved that under the Wolfe-Powell line search, and even under the strong Wolfe-Powell line search with parameter σ ∈ (0, 1/2), the corresponding method has the sufficient descent and global convergence properties. Numerical results show that the proposed method is very promising.

3. A spectral conjugate gradient method is proposed by combining the conjugate gradient method and the spectral gradient method. The direction generated by the method is a descent direction for the objective function, and this property depends neither on the line search rule nor on the convexity of the objective function. Moreover, the modified method reduces to the standard CD method when the line search is exact. Under some mild conditions, the modified method with line search is proved to be globally convergent even if the objective function is nonconvex. Preliminary numerical results show that the proposed method is very promising.

4. The MWYL method, based on the ideas of [9], is studied. An interesting property of this algorithm is that it possesses Property (*). It is proved that under the strong Wolfe-Powell line search the corresponding method has global convergence properties. Numerical results show that the proposed method is more promising than the PRP+ method.
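To make the general scheme concrete, the following is a minimal sketch of a nonlinear conjugate gradient iteration with the classical PRP beta, truncated at zero (the PRP+ safeguard), and a simple backtracking Armijo line search. This is an illustration of the standard framework only; the thesis's modified algorithms use different βk formulas, modified directions dk, and (strong) Wolfe-Powell line searches, none of which are reproduced here.

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient with PRP+ beta and backtracking
    Armijo line search (illustrative sketch, not the thesis's methods)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking Armijo line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, gTd = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * gTd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP beta, truncated at zero (PRP+) to aid global convergence
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        if g_new @ d >= 0:              # restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic f(x) = ½xᵀAx − bᵀx the iteration drives the gradient Ax − b to zero; restarting with the steepest-descent direction whenever dk fails to be a descent direction is a common practical safeguard.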
Keywords/Search Tags:unconstrained optimization, nonlinear conjugate gradient method, spectral gradient method, line search, global convergence