In this paper, we discuss a class of optimization problems with nonlinear inequality constraints. Based on the ideas of the norm-relaxed SQP method and the strongly sub-feasible direction method of J.B. Jian et al. (Applied Mathematics and Computation, 182, pp. 955-976, 2006), we propose a new superlinearly convergent algorithm that does not require strict complementarity. Unlike previous work, the norm-relaxed QP subproblem in our algorithm involves only the constraints corresponding to an estimate of the active set; the constraints outside this estimate are neglected. In addition, the high-order correction direction, which avoids the Maratos effect, is obtained by solving a system of linear equations that likewise involves only the estimated active constraints and their gradients. Consequently, the size and the computational cost of computing the high-order correction direction are further reduced. In particular, since the step search technique used in our algorithm effectively combines the initialization and optimization processes, the iterates enter the feasible set after finitely many steps. The new algorithm possesses global convergence under the relatively weak Mangasarian-Fromovitz constraint qualification (MFCQ), and superlinear convergence under some mild assumptions without strict complementarity. Finally, we report numerical experiments showing that the proposed algorithm is practical and promising on the test problems.
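For orientation, a norm-relaxed QP subproblem restricted to an active-set estimate can be sketched in the following schematic form; the symbols $A_k$ (active-set estimate), $H_k$ (approximate Hessian), and the exact constraint handling are illustrative assumptions here, not the paper's precise formulation:

```latex
\min_{(z,d)\in\mathbb{R}\times\mathbb{R}^{n}} \; z + \tfrac{1}{2}\, d^{T} H_k d
\quad \text{s.t.} \quad
\nabla f(x_k)^{T} d \le z, \qquad
g_i(x_k) + \nabla g_i(x_k)^{T} d \le z, \;\; i \in A_k.
```

Because the inequality constraints with indices outside $A_k$ are simply omitted, the subproblem's size depends on the cardinality of the active-set estimate rather than on the total number of constraints.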