
Filter Line Search Methods For Nonlinear Optimization

Posted on: 2011-07-24  Degree: Doctor  Type: Dissertation
Country: China  Candidate: Z J Wang  Full Text: PDF
GTID: 1100360302492022  Subject: Computational Mathematics
Abstract/Summary:
The line search method, which is simple and reliable, is one of the basic strategies for ensuring the global convergence of optimization methods. Its two key ingredients are finding a search direction and identifying a step size: the search direction largely determines the speed of convergence, while the step size ensures the convergence of the descent-direction algorithm. In this thesis the step size is determined by the Armijo criterion, and the line search methods are then extended by extending the Armijo criterion.

Filter algorithms are usually used for solving constrained optimization problems. Their main idea is that a trial point is accepted if it improves either the objective function or the constraint violation. The filter technique thus strikes a good balance between the objective function and the constraint violation, replacing the traditional penalty function method as the device that ensures global convergence of the algorithm. Filter methods can be applied within the trust region sequential quadratic programming (SQP) framework as well as the line search framework; Wächter and Biegler established global convergence results for filter-based line search methods.

The thesis introduces the filter technique proposed by Fletcher and Leyffer, then incorporates the filter method into the nonmonotone technique, the projected reduced Hessian method, the complete projected secant method, the affine-scaling interior-point approach, interior-point barrier methods, and so on. We focus on establishing filter line search algorithm frameworks and applying them to a few typical optimization problems. We analyze the global and local convergence of the proposed algorithms and examine their practical performance through numerical experiments.

Filter methods were traditionally used for constrained nonlinear systems and were creatively extended by Gould, Toint and Sainvitu to unconstrained optimization via a multidimensional filter trust region technique.
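The two acceptance tests named above, the Armijo step-size criterion and the Fletcher–Leyffer filter test, can be sketched as follows. This is a minimal illustration, not the thesis's actual algorithms; the function names and the margin parameter `gamma` are our own choices:

```python
def armijo_accepts(phi, dphi0, alpha, sigma=1e-4):
    """Armijo criterion: accept step size alpha along a descent
    direction d if  phi(alpha) <= phi(0) + sigma * alpha * dphi0,
    where phi(a) = f(x + a*d) and dphi0 = grad_f(x)^T d < 0."""
    return phi(alpha) <= phi(0.0) + sigma * alpha * dphi0

def filter_accepts(f_trial, h_trial, filter_entries, gamma=1e-5):
    """Fletcher-Leyffer filter test with a small margin gamma:
    the trial pair (f, h) of objective value and constraint
    violation must sufficiently improve either f or h against
    EVERY pair (f_k, h_k) currently stored in the filter."""
    return all(
        f_trial <= f_k - gamma * h_k or h_trial <= (1.0 - gamma) * h_k
        for f_k, h_k in filter_entries
    )
```

In a filter line search, a backtracked trial step must pass `filter_accepts` (or a sufficient-decrease condition such as the Armijo test) before it is taken, and accepted pairs that are not dominated are added to the filter.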
By exploiting the relevant characteristics of unconstrained optimization, we transform it into a special equality constrained minimization and then combine the filter line search with Newton's method, the nonmonotone technique, and the MBFGS method for unconstrained optimization. The global convergence and fast local convergence rate of the proposed algorithms are established under reasonable conditions. Numerical experiments indicate that the proposed methods are superior to the classical line search algorithms.

The secant methods proposed by Fontecilla are very successful for nonlinear equality constrained optimization. By using the DFP or BFGS secant updates to approximate the Hessian of the Lagrangian function, we can greatly reduce storage space and computing cost. The feature of these algorithms is that improved secant algorithms produce the search direction, a filter line search procedure generates the step size, and a second order correction technique overcomes the Maratos effect. Besides global convergence, we prove that our approach converges locally two-step superlinearly. Numerical results show the effectiveness of our algorithms.

The reduced Hessian successive quadratic programming algorithm has been shown to be one of the successful methods for large-scale nonlinear constrained optimization; it uses only part of the Hessian of the Lagrangian function and has small memory requirements at each iteration. We present a reduced Hessian filter line search method for nonlinear equality constrained optimization. The global convergence and fast local superlinear convergence rate of the proposed algorithm are established under reasonable conditions.
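The secant update at the heart of such methods can be sketched in pure Python. This shows the standard BFGS formula with the usual curvature safeguard, not the thesis's MBFGS variant; the tolerance `1e-12` is our own choice:

```python
def bfgs_update(B, s, y):
    """BFGS update of a symmetric approximation B to a Hessian:
        B+ = B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s),
    where s is the step x_{k+1} - x_k and y is the change in the
    gradient (of the Lagrangian, in the constrained setting).
    B is a list-of-lists matrix; s, y are lists.
    The update is skipped when the curvature condition
    y^T s > 0 fails, which would destroy positive definiteness."""
    n = len(s)
    Bs = [sum(B[i][j] * s[j] for j in range(n)) for i in range(n)]
    sBs = sum(s[i] * Bs[i] for i in range(n))
    ys = sum(y[i] * s[i] for i in range(n))
    if ys <= 1e-12:            # curvature condition violated: keep B
        return B
    return [[B[i][j] - Bs[i] * Bs[j] / sBs + y[i] * y[j] / ys
             for j in range(n)] for i in range(n)]
```

The updated matrix satisfies the secant equation B+ s = y, so only gradient differences are needed to build curvature information, which is what saves the storage and computation mentioned above.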
Numerical experiments are reported to show the feasibility of the algorithm.

Since the filter line search is favorable to the feasibility of inequality constraints, we combine the interior point projection and affine-scaling projection techniques according to the particular structure of box constraints and of linear inequality constraints, and study the application of the filter line search method to optimization with simple bounds and with linear inequality constraints, respectively. Under mild conditions, the global convergence and local superlinear convergence of the proposed algorithms are obtained. Numerical results show that the new algorithms have practical value.

Quite a few articles propose interior-point methods for optimization with simple bounds or with linear inequality constraints. However, for effectively solving large-scale optimization problems with mixed nonlinear equality and inequality constraints, research on interior-point algorithms remains scarce. We apply the ideas and techniques of Newton's method and the secant method to the equality constraints, and of the interior point projection method and the affine-scaling projection technique to the inequality constraints, solving such problems comprehensively in combination with the filter line search method. The algorithms avoid the Maratos effect by employing in the filter the Lagrangian function and the constraint violation, which come from Fletcher's smooth penalty function and relate to the equality constraints. We show that the proposed algorithms enjoy global convergence and a fast local convergence rate. Numerical results indicate that these algorithms are feasible and effective.

Finally, we summarize the main research results of this thesis and give some ideas for future research.
Keywords/Search Tags: Optimization, Filter technique, Line search method, Nonmonotone technique, Secant methods, Reduced Hessian approach, Newton's method, Interior point method, Global convergence, Local convergence rate, Second order correction, Maratos effect