
Algorithms For Complementarity Problems And Nonsmooth Convex Minimization

Posted on: 2013-06-26    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Q Li
GTID: 1220330374991222    Subject: Computational Mathematics
Abstract/Summary:
In this thesis, we study numerical algorithms for nonlinear complementarity problems and nonsmooth convex minimization. To solve nonlinear complementarity problems, we propose several algorithms based on equation reformulations of the problem. Using the Moreau-Yosida regularization, we propose a class of conjugate gradient type algorithms and a spectral conjugate gradient type algorithm for nonsmooth convex minimization. We establish global convergence for all of the above algorithms and report numerical experiments that demonstrate their effectiveness.

In Chapter 2, we derive a semismooth equation reformulation of the nonlinear complementarity problem which is almost smooth: the reformulation function is continuously differentiable everywhere except at solutions of the problem, and it is semismooth (almost smooth) at any solution. In particular, if the solution set of the problem is a singleton, then the reformulation function is a basic strongly almost smooth function. The reformulation preserves many of the good properties enjoyed by the Fischer-Burmeister function based nonsmooth equation reformulation, such as boundedness of the level sets and local/global error bounds. Based on the almost smooth equation reformulation, we propose a Newton method for solving the nonlinear complementarity problem and, under appropriate conditions, establish its global and superlinear convergence. Limited numerical results show the effectiveness of the proposed algorithm.

In Chapter 3, we propose a smoothing Newton method and a homotopy smoothing method for nonlinear complementarity problems. Unlike existing smoothing functions, we first develop a new smoothing function for the generalized derivative of the absolute value function, from which we obtain an approximation to the absolute value function itself.
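For context, the classical Fischer-Burmeister reformulation that the almost smooth reformulation above is compared against can be sketched as follows. This is a generic illustration, not the thesis's new reformulation; the example map F is hypothetical.

```python
import numpy as np

# Fischer-Burmeister function: phi(a, b) = sqrt(a^2 + b^2) - a - b.
# phi(a, b) = 0  iff  a >= 0, b >= 0, and a*b = 0, so the NCP
#   x >= 0, F(x) >= 0, x^T F(x) = 0
# is equivalent to the nonsmooth equation Phi(x) = 0, where
#   Phi_i(x) = phi(x_i, F_i(x)).
def fischer_burmeister(a, b):
    return np.sqrt(a**2 + b**2) - a - b

def ncp_residual(F, x):
    """Componentwise Fischer-Burmeister residual of the NCP defined by F."""
    return fischer_burmeister(x, F(x))

# Illustrative NCP with F(x) = x - 1: the solution is x = 1
# (there x > 0 and F(x) = 0, so complementarity holds).
F = lambda x: x - 1.0
print(ncp_residual(F, np.array([1.0])))  # -> [0.]
```

The residual vanishes exactly at NCP solutions, which is what makes (semismooth) Newton-type methods on this equation natural.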
The new smoothing function possesses the Jacobian consistency property. Based on it, we develop a smoothing Newton method and a homotopy smoothing method for solving nonlinear complementarity problems and, under appropriate conditions, establish the global and superlinear convergence of both. We also show that, when applied to a linear complementarity problem, the homotopy smoothing method terminates at a solution after finitely many iterations.

In Chapter 4, we are concerned with derivative-free methods for solving symmetric nonlinear complementarity problems. We first transform the problem into an equivalent nonsmooth equation and then extend two recently developed modified PRP conjugate gradient methods to solve it. The methods are derivative-free and norm descent. Under mild conditions, we show that both methods are globally convergent, and preliminary numerical experience demonstrates their efficiency.

In Chapter 5, using the Moreau-Yosida regularization, we first convert the nonsmooth convex minimization problem into a smooth convex minimization problem. We then propose a class of implementable conjugate gradient type methods that use approximate function and gradient values of the Moreau-Yosida regularization rather than their exact values. We first study the properties common to these methods and then examine three specific conjugate gradient type methods, establishing global convergence for each. Compared with existing methods, our algorithms are easier to implement and can be applied to large-scale problems.

In Chapter 6, by exploiting inherent properties of the Moreau-Yosida regularization, we again convert the nonsmooth convex minimization problem into a smooth one and propose an easily implementable spectral-type conjugate gradient method for nonsmooth convex minimization.
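The Moreau-Yosida regularization underlying Chapters 5 and 6 is standard: for a proper closed convex function $f$ and a parameter $\lambda > 0$,

```latex
F_\lambda(x) \;=\; \min_{y \in \mathbb{R}^n} \Big\{ f(y) + \tfrac{1}{2\lambda}\,\|y - x\|^2 \Big\},
```

with unique minimizer $p_\lambda(x)$, the proximal point of $x$. The envelope $F_\lambda$ is convex and continuously differentiable with

```latex
\nabla F_\lambda(x) \;=\; \frac{x - p_\lambda(x)}{\lambda},
```

a gradient that is Lipschitz continuous with constant $1/\lambda$, and the minimizers of $F_\lambda$ coincide with those of $f$. This is why smooth gradient-type schemes such as conjugate gradient methods become applicable once $f$ is replaced by $F_\lambda$.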
The proposed method uses approximate function and gradient values of the Moreau-Yosida regularization rather than the corresponding exact values, and it is globally convergent under mild conditions.

This dissertation was supported by the Major Project of the Ministry of Education of China (309023) and the National Natural Science Foundation of China (11071087).
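As a minimal illustration of the regularization-then-smooth-descent scheme (plain gradient descent on the envelope for the assumed model problem f = ||·||₁, not the thesis's conjugate gradient algorithms, which additionally work with inexact envelope values):

```python
import numpy as np

# Sketch: minimize a nonsmooth convex f via its Moreau-Yosida envelope
#   F_lam(x) = min_y { f(y) + ||y - x||^2 / (2*lam) },
# whose gradient is grad F_lam(x) = (x - prox(x)) / lam.
# For f(x) = ||x||_1 the proximal point is soft thresholding.

def prox_l1(x, lam):
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_gradient(x, lam):
    return (x - prox_l1(x, lam)) / lam

def smooth_descent(x0, lam=0.5, step=0.5, iters=100):
    """Gradient descent on the smooth envelope F_lam."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        x = x - step * moreau_gradient(x, lam)
    return x

x = smooth_descent(np.array([3.0, -2.0, 0.5]))
print(x)  # converges to 0, the minimizer of ||x||_1
```

With step size equal to lam (valid since the gradient is 1/lam-Lipschitz), each iteration reduces to the proximal point step itself, so the iterates shrink toward the minimizer of f.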
Keywords/Search Tags: nonlinear complementarity problem, nondifferentiable convex minimization, Newton's method, derivative-free methods, conjugate gradient type methods, global convergence