This thesis studies nonlinearly constrained optimization problems. Among the algorithms for such problems, it focuses on sequential quadratic programming (SQP) and trust region methods, both of which have a long record of successful results. Building on this research, improved algorithms are proposed that reduce the computational cost per iteration while retaining global and superlinear convergence.

In the first chapter, an improved SQP algorithm for nonlinearly inequality constrained optimization problems is presented. The algorithm reduces the computation by restricting the index set and using a feasible descent direction, so that the search direction obtained is both feasible and descent. The improved algorithm requires less computation, and its global and superlinear convergence are established.

In the second chapter, an improved trust region algorithm for nonlinearly constrained optimization is given, which incorporates an auxiliary descent direction. The method obtains its search direction by solving a quadratic programming subproblem. When this direction fails to be a descent direction, the algorithm does not return to re-solve the quadratic programming problem; instead, it modifies the direction so that it is feasible and descent. Under suitable assumptions, global and superlinear convergence of the method are proved.
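The core mechanism of an SQP step, as described for chapter one, is to obtain the search direction from a quadratic programming subproblem. The following is a minimal sketch of one such step, not the thesis's algorithm: the toy problem (minimize x1² + x2² subject to x1 + x2 ≥ 1), the choice of exact Hessian, and the treatment of the single constraint as active (so the QP reduces to a linear KKT system) are all assumptions made for illustration; the thesis's restricted index set and feasibility safeguards are not reproduced here.

```python
import numpy as np

# Toy problem (assumed for illustration, not from the thesis):
#   minimize  f(x) = x1^2 + x2^2
#   subject to  g(x) = 1 - x1 - x2 <= 0   (i.e. x1 + x2 >= 1)
def grad_f(x):
    return 2.0 * x

hess_f = 2.0 * np.eye(2)          # exact Hessian of f
grad_g = np.array([-1.0, -1.0])   # gradient of the constraint g

def sqp_step(x):
    """One SQP step: solve the QP subproblem
         min_d  0.5 d^T H d + grad_f(x)^T d
         s.t.   grad_g^T d + g(x) = 0   (constraint treated as active)
       via its KKT system, then update x."""
    g_val = 1.0 - x[0] - x[1]
    K = np.block([[hess_f, grad_g[:, None]],
                  [grad_g[None, :], np.zeros((1, 1))]])
    rhs = np.concatenate([-grad_f(x), [-g_val]])
    sol = np.linalg.solve(K, rhs)
    d, lam = sol[:2], sol[2]      # search direction and multiplier
    return x + d, lam

# Because f is quadratic and g is linear, a single Newton-type SQP
# step from any starting point lands on the solution (0.5, 0.5).
x_new, lam = sqp_step(np.array([2.0, 0.0]))
```

The positive multiplier returned confirms the active-constraint guess; a full SQP method would instead decide activity through its working index set at every iteration.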
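The chapter-two idea of replacing a non-descent search direction instead of re-solving the subproblem can be illustrated by the following hedged sketch. This is not the thesis's method: it is an unconstrained trust-region loop where the fallback is a plain steepest-descent direction standing in for the thesis's modified feasible descent direction, and the test problem (Rosenbrock), radius-update constants, and function names are all invented for the example.

```python
import numpy as np

def trust_region_minimize(f, grad, hess, x, delta=1.0, tol=1e-8, max_iter=500):
    """Basic trust-region loop (illustrative only): try a Newton-type
    model step; if it is not a descent direction, swap in a modified
    (steepest-descent) direction rather than re-solving the model."""
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        B = hess(x)
        try:
            d = np.linalg.solve(B, -g)    # model (Newton-type) step
        except np.linalg.LinAlgError:
            d = -g
        if g @ d >= 0:
            d = -g                        # fallback: modified descent direction
        nd = np.linalg.norm(d)
        if nd > delta:
            d = d * (delta / nd)          # restrict step to the trust region
        # ratio of actual to predicted reduction governs acceptance and radius
        pred = -(g @ d + 0.5 * d @ B @ d)
        ared = f(x) - f(x + d)
        rho = ared / pred if pred > 0 else -1.0
        if rho > 0.25:
            x = x + d                     # accept the step
        if rho > 0.75 and nd >= 0.99 * delta:
            delta *= 2.0                  # model is good: enlarge the region
        elif rho < 0.25:
            delta *= 0.5                  # model is poor: shrink the region
    return x

# Usage on the Rosenbrock function (a standard test problem, not from the thesis);
# the iterates approach the minimizer (1, 1).
def f(x):
    return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

def grad(x):
    return np.array([-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0]**2),
                     200.0 * (x[1] - x[0]**2)])

def hess(x):
    return np.array([[2.0 - 400.0 * x[1] + 1200.0 * x[0]**2, -400.0 * x[0]],
                     [-400.0 * x[0], 200.0]])

x_star = trust_region_minimize(f, grad, hess, np.array([-1.2, 1.0]))
```

The key point mirrored from the abstract is the single `if g @ d >= 0` branch: the expensive subproblem is solved once per iteration, and a cheap modification supplies a usable descent direction when needed.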