In this thesis, based on the flattened aggregate function, a class of active-set smoothing functions for the max-function is proposed. These smoothing functions involve only those component functions whose values are close to the max-function value; hence, compared with the aggregate function, the new smoothing functions offer better computational and storage efficiency. Several properties of the new smoothing functions are proved. Based on these smoothing functions, a smoothing Newton algorithm for solving finite-dimensional unconstrained minimax problems is proposed. The algorithm adopts a feedback precision-adjustment rule, which keeps the precision parameter large in the early iterations and decreases it toward 0 as the iterations proceed, resulting in much better management of ill-conditioning. At each iteration, only a small subset of the component functions of the max-function is involved, so the number of gradient and Hessian evaluations is reduced dramatically. Numerical results show the efficiency and stability of the proposed algorithm. Finally, the proposed smoothing Newton algorithm is applied to solve the rotated parabolic fitting problem.
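As a rough illustration of the idea (a sketch of a commonly used construction, not necessarily the thesis's exact definition), the flattened aggregate smoothing of the max-function and its active-set restriction can be written as follows, where $p > 0$ is the smoothing (precision) parameter and $\varepsilon > 0$ is an assumed activity threshold:

\[
F(x) = \max_{1 \le i \le m} f_i(x), \qquad
F_p(x) = F(x) + \frac{1}{p}\ln\!\Big(\sum_{i=1}^{m} e^{\,p\,(f_i(x)-F(x))}\Big),
\]
\[
I_\varepsilon(x) = \{\, i : f_i(x) \ge F(x) - \varepsilon \,\}, \qquad
\tilde{F}_p(x) = F(x) + \frac{1}{p}\ln\!\Big(\sum_{i \in I_\varepsilon(x)} e^{\,p\,(f_i(x)-F(x))}\Big).
\]

Because $e^{\,p\,(f_i(x)-F(x))}$ is negligible for components whose values lie far below the maximum, restricting the sum to the active set $I_\varepsilon(x)$ changes the smoothed value only slightly, while gradient and Hessian evaluations are needed only for the active components.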