
Filter Method For Solving Nonlinear Optimization Problems

Posted on: 2010-11-29    Degree: Doctor    Type: Dissertation
Country: China    Candidate: C Gu    Full Text: PDF
GTID: 1110360302492019    Subject: Computational Mathematics
Abstract/Summary:
The study of the theory and methods of nonlinear (un)constrained optimization consists of two parts: global convergence and the local convergence rate. Both the line search technique and the trust region strategy are well-accepted tools for ensuring the global convergence of optimization algorithms. At the same time, along with the development of computers and the improvement of software, numerical experiments on optimization problems are becoming more and more feasible and effective.

This thesis is devoted to a systematic study of algorithms for nonlinear constrained optimization, including problems with equality constraints, nonnegativity constraints, and general constraints. An algorithm for nonlinear systems of equalities and inequalities is also considered. By employing the filter idea introduced by Fletcher and Leyffer, we propose several efficient line search filter methods for solving nonlinear constrained optimization problems and nonlinear systems of equalities and inequalities.

Fletcher and Leyffer presented filter methods for nonlinear programming (NLP) as an alternative to merit functions, that is, as a tool to guarantee the global convergence of algorithms for nonlinear programming. The underlying concept is that trial points are accepted if they improve either the objective function or the constraint violation. Previous theoretical work on filter methods had considered only trust region algorithms and only the question of global convergence. We therefore extend the line search filter method to two secant algorithms for nonlinear equality constrained optimization by employing the Lagrangian function L(x,λ) (instead of the objective function f(x)) in the filter. Besides global convergence, we prove that our approach converges locally two-step Q-superlinearly, even without the additional second-order correction that many algorithms need in order to avoid the Maratos effect.

Recent research indicates that the monotone line search technique may have some drawbacks.
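Returning to the filter mechanism: the Fletcher-Leyffer acceptance test can be sketched as follows. This is a minimal illustration, not the thesis's exact algorithm; the pair (theta, phi) stands for (constraint violation, objective or Lagrangian value), and the margin parameters gamma_theta, gamma_phi are illustrative choices.

```python
class Filter:
    """A minimal sketch of a Fletcher-Leyffer-style filter."""

    def __init__(self, gamma_theta=1e-5, gamma_phi=1e-5):
        self.entries = []              # accepted (theta, phi) pairs
        self.gamma_theta = gamma_theta # illustrative margin parameters
        self.gamma_phi = gamma_phi

    def acceptable(self, theta, phi):
        """A trial point is acceptable if, for every filter entry, it
        sufficiently improves either the constraint violation or the
        objective, i.e. it is not dominated by any entry."""
        return all(
            theta <= (1.0 - self.gamma_theta) * t
            or phi <= p - self.gamma_phi * t
            for (t, p) in self.entries
        )

    def add(self, theta, phi):
        # Discard entries dominated by the new pair, then insert it.
        self.entries = [(t, p) for (t, p) in self.entries
                        if not (theta <= t and phi <= p)]
        self.entries.append((theta, phi))


f = Filter()
f.add(1.0, 5.0)
print(f.acceptable(0.5, 6.0))   # better violation  -> True
print(f.acceptable(2.0, 6.0))   # worse in both     -> False
```

The domination test is what lets a filter replace a merit function: no weighting parameter between objective and violation is needed, only the record of previously accepted pairs.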
In particular, enforcing monotonicity may considerably reduce the rate of convergence when the iteration is trapped near a narrow curved valley. Grippo et al. generalized the Armijo rule and proposed a nonmonotone line search technique for Newton's method which permits increases in the function value while retaining the global convergence of the minimization algorithm. Several numerical tests show that the nonmonotone line search technique is efficient and competitive. We propose a filter secant method with nonmonotone line search for nonlinear equality constrained optimization. The main contribution of this work is to apply the nonmonotone idea to the sufficient-reduction conditions and to the filter. The new method offers more flexibility in the acceptance of the trial step and requires lower computational costs than the monotone one.

A reduced Hessian successive quadratic programming algorithm has been shown to be successful in solving large-scale nonlinear optimization problems, especially problems with few degrees of freedom. Based on the reduced Hessian algorithm and the nonmonotone technique presented by Yu and Pu, we propose a nonmonotone filter reduced Hessian method for nonlinear equality constrained optimization. As in Chapter 2, the Lagrangian function L(x,λ) is employed in the filter. Global and local convergence is established under reasonable assumptions. The numerical results show that the nonmonotone algorithm is more effective than the monotone one and does not suffer from the Maratos effect on any of the problems tested here, i.e., unit steps are accepted in the last several iterations.

Growing interest in efficient optimization methods has led to the development of interior-point (barrier) methods for large-scale nonlinear programming. In particular, these methods provide an attractive alternative to active set strategies in handling problems with large numbers of inequality constraints.
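The nonmonotone line search idea of Grippo et al. discussed above can be sketched as a backtracking Armijo search whose sufficient-decrease test compares against the maximum of the last M function values instead of the current one. The parameter names (M, sigma, beta) and the steepest-descent usage below are illustrative assumptions, not the thesis's exact settings.

```python
def nonmonotone_armijo(f, grad, x, d, history, M=5, sigma=1e-4, beta=0.5,
                       max_backtracks=50):
    """Return a step size alpha with
       f(x + alpha*d) <= max(last M f-values) + sigma*alpha*grad(x)^T d,
    a Grippo-style relaxation of the classical Armijo rule."""
    ref = max(history[-M:])                               # nonmonotone reference
    slope = sum(g * di for g, di in zip(grad(x), d))      # directional derivative
    alpha = 1.0
    for _ in range(max_backtracks):
        trial = [xi + alpha * di for xi, di in zip(x, d)]
        if f(trial) <= ref + sigma * alpha * slope:
            return alpha
        alpha *= beta                                     # backtrack
    return alpha


# Usage on a simple quadratic f(x) = x1^2 + x2^2 with a steepest-descent step:
f = lambda x: x[0] ** 2 + x[1] ** 2
grad = lambda x: [2 * x[0], 2 * x[1]]
x = [1.0, 1.0]
d = [-g for g in grad(x)]
alpha = nonmonotone_armijo(f, grad, x, d, history=[f(x)])
print(alpha)   # -> 0.5
```

Because the reference value is a running maximum, an iterate may temporarily increase f, which is exactly the flexibility exploited in the nonmonotone filter methods of Chapters 2 and 3.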
We propose a filter interior-point method for nonlinear optimization with equality and nonnegativity constraints. Numerical experiments are reported to show the effectiveness of the proposed algorithm.

Filter methods have been extended by Gould, Leyffer and Toint to nonlinear equations and nonlinear least squares, and by Gould, Sainvitu and Toint to unconstrained optimization via the multidimensional filter technique. Wachter and Biegler presented line search filter methods for nonlinear equality constrained programming. Since many problems contain both equality and inequality constraints, we generalize the Wachter-Biegler methods to general nonlinear programming based on the multidimensional filter technique. Under mild conditions, the global convergence and local superlinear convergence of the method are obtained. If the nonlinear program has only equality constraints and M = 1, the method is actually the Wachter-Biegler line search filter method. The numerical results show that the new method is effective compared with SNOPT.

Systems of nonlinear equalities and inequalities appear in a wide variety of problems in applied mathematics. These systems play a central role in model formulation, design, and the analysis of numerical techniques employed in solving problems arising in optimization, complementarity, and variational inequalities. As an application, we propose a nonmonotone filter method for nonlinear systems of equalities and inequalities which guarantees global convergence and possesses a local superlinear convergence rate under suitable conditions. The numerical results show that our algorithm performs well compared with two trust-region methods, and that the full step x_{k+1} = x_k + d_k, or x_{k+1} = x_k + d_k + d_k^{soc}, is accepted in the last several iterations, so the sequence {x_k} converges to x* Q-superlinearly. Finally, we summarize the main new research results of this thesis and propose some directions for further research on this work.
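For such mixed systems c_E(x) = 0, c_I(x) <= 0, a filter method needs a single scalar violation measure that vanishes exactly at feasible points. One common choice, sketched below, stacks the equality residuals with the positive parts of the inequality residuals and takes a norm; the 2-norm here is an illustrative assumption, not necessarily the norm used in the thesis.

```python
import math


def violation(c_E, c_I, x):
    """theta(x) = || (c_E(x), max(0, c_I(x))) ||_2; zero iff x is feasible
    for the system c_E(x) = 0, c_I(x) <= 0."""
    parts = list(c_E(x)) + [max(0.0, ci) for ci in c_I(x)]
    return math.sqrt(sum(p * p for p in parts))


# Hypothetical example: equality x1 + x2 = 1, inequality x1 <= 0.
c_E = lambda x: [x[0] + x[1] - 1.0]
c_I = lambda x: [x[0]]
print(violation(c_E, c_I, [0.0, 1.0]))   # feasible point -> 0.0
print(violation(c_E, c_I, [2.0, 0.0]))   # infeasible     -> sqrt(5)
```

With theta defined this way, the filter machinery for constrained optimization carries over directly: the "objective" slot of each filter entry can hold any smooth measure being driven to zero, and acceptance is again decided by domination.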
Keywords/Search Tags: Optimization, Filter, Line search method, Nonmonotone technique, Interior point method, Global convergence, Local convergence rate, Maratos effect