
Research On Two Methods For Solving Optimization Problems

Posted on: 2007-01-23  Degree: Master  Type: Thesis
Country: China  Candidate: X P Wu  Full Text: PDF
GTID: 2120360185487473  Subject: Applied Mathematics
Abstract/Summary:
Our main purpose in this paper is to study a new conjugate gradient algorithm and a new SQP method.

In Chapter 1, we recall foundational knowledge about the conjugate gradient method and some well-known results, and describe the BFGS and BFGS-type formulas.

In Chapter 2, we propose a new conjugate gradient-type formula for solving nonlinear unconstrained optimization problems. The new method defines the range of its parameters using the well-known formula β_k^CD, which enlarges the admissible parameter range compared with earlier methods. We prove that the new formula satisfies the sufficient descent condition and that the corresponding algorithm with a general Wolfe line search is globally convergent.

In Chapter 3, a new nonlinear conjugate gradient formula for solving unconstrained optimization problems is proposed, constructed by combining the well-known formulas β_k^FR and β_k^PRP. The new formula satisfies the sufficient descent condition, and the corresponding algorithm is globally convergent under the weak Wolfe conditions. Preliminary numerical results show that the method is promising.

In Chapter 4, using the formula A_k(2) given by Wei, we propose a new A_k(2)-SQP method, which extends the use of the quasi-Newton BFGS formula and combines it with the SQP method to solve constrained optimization problems. The corresponding algorithm possesses global convergence and superlinear convergence properties.
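To illustrate the general shape of the methods studied in Chapters 2 and 3, the following is a minimal sketch of a nonlinear conjugate gradient loop with a hybrid FR/PRP parameter. The hybrid rule shown here, β_k = max(0, min(β_k^PRP, β_k^FR)), is one classical combination and is not necessarily the thesis's exact new formula; the simple backtracking (Armijo) line search is likewise a stand-in for the weak Wolfe search used in the thesis. The function names `f`, `grad`, and `hybrid_cg` are illustrative.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with a hybrid FR/PRP beta.

    Uses beta_k = max(0, min(beta_PRP, beta_FR)), a classical hybrid;
    the line search is a plain Armijo backtracking search, a simplified
    stand-in for a weak Wolfe search.
    """
    x = np.asarray(x0, dtype=float).copy()
    g = grad(x)
    d = -g                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search: shrink alpha until the Armijo
        # (sufficient decrease) condition holds.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves and Polak-Ribiere-Polyak parameters.
        beta_fr = g_new.dot(g_new) / g.dot(g)
        beta_prp = g_new.dot(g_new - g) / g.dot(g)
        beta = max(0.0, min(beta_prp, beta_fr))
        d = -g_new + beta * d   # new search direction
        x, g = x_new, g_new
    return x
```

For example, minimizing the quadratic f(x) = x·x from the starting point (3, -4) drives the iterates to the origin. The same loop structure applies to general smooth objectives, with the choice of β_k governing the descent and convergence properties analyzed in the thesis.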
Keywords/Search Tags:Conjugate gradient method, Sufficient descent property, A_k(2)-SQP method, Line search, Global convergence, Superlinear convergence