
Results On Some Problems In Quasi-Differentiable Analysis And Optimization: Kernels·Convexificators·Optimality Conditions

Posted on: 2006-11-20    Degree: Doctor    Type: Dissertation
Country: China    Candidate: C. L. Song    Full Text: PDF
GTID: 1100360152485498    Subject: Operational Research and Cybernetics
Abstract/Summary:
This dissertation studies the calculus theory of quasidifferentiable functions in two ways: one via quasidifferential kernels, the other via convexificators. Necessary optimality conditions for constrained quasidifferentiable optimization are also discussed. The main results may be summarized as follows:

1. Chapter 3 establishes two equality formulae for the Demyanov difference of convex compact sets under an orthogonal-complementarity assumption; these are useful for computing the Demyanov difference of the subdifferential and the negative superdifferential (the Demyanov sum of the quasidifferential) of sum functions and of maximum (minimum) functions. To address the uniqueness of the quasidifferential for quasidifferentiable functions, some properties of high-dimensional kernels are given under the assumption that the Demyanov difference of the subdifferential and the negative superdifferential coincides with their Minkowski difference, and a special class of quasidifferentiable functions with high-dimensional kernels (in the sense of Yan Gao), the sub-superdifferentiable functions, is presented. For a special class of quasidifferentiable optimization, namely D.C. optimization, a convergence analysis of the steepest descent algorithm is obtained.

2. Chapter 4 proposes the concept of convexificator kernels for quasidifferentiable functions, which can be used to decide whether a quasidifferentiable function is subdifferentiable. Concrete convexificators of quasidifferentiable functions are given via the recession functions of positively homogeneous functions, and these convexificators are shown to be consistent with the Demyanov sum of quasidifferentials.

3. Chapter 5 is devoted to the study of constrained quasidifferentiable optimization with equality, inequality and abstract constraints.
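For orientation, the central objects of the dissertation can be sketched in standard Demyanov–Rubinov notation (this notation is supplied here for the reader's convenience and is not quoted from the dissertation itself):

```latex
% A function f is quasidifferentiable at x if it is directionally
% differentiable there and the directional derivative splits as
f'(x; g) \;=\; \max_{v \in \underline{\partial} f(x)} \langle v, g \rangle
        \;+\; \min_{w \in \overline{\partial} f(x)} \langle w, g \rangle,
% where the quasidifferential is the pair of convex compact sets
Df(x) \;=\; \bigl[\underline{\partial} f(x),\; \overline{\partial} f(x)\bigr]
% (subdifferential and superdifferential, respectively).

% The Demyanov difference of convex compact sets A, B is built from the
% gradients of their support functions \sigma_A, \sigma_B on the dense
% set T of directions where both are differentiable:
A \ominus B \;=\; \operatorname{cl}\,\operatorname{conv}
  \bigl\{\, \nabla\sigma_A(g) - \nabla\sigma_B(g) : g \in T \,\bigr\}.
```

In particular, a D.C. function f = g − h (g, h convex) is quasidifferentiable with Df(x) = [∂g(x), −∂h(x)], which is the setting of the steepest descent analysis in Chapter 3.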
Under the mild assumption that there exists a pair of quasidifferentials whose subdifferential and superdifferential mappings are upper semicontinuous, Fritz John necessary optimality conditions via the Demyanov difference are obtained by means of the Ekeland variational principle. As an application, necessary optimality conditions for bilevel programming are given. If a maximal-rank condition is added, KKT necessary optimality conditions can be established. Moreover, Fritz John necessary optimality conditions via a sublinear functional are given without any such assumption. Both results are independent of the choice of supergradients.
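For context, recall the classical smooth-case Fritz John conditions for minimizing f subject to g_i(x) ≤ 0, i = 1, …, m (the dissertation's quasidifferentiable versions replace the gradients below with quasidifferentials and Demyanov differences; this smooth statement is background, not a result of the thesis):

```latex
% At a local minimizer x^* there exist multipliers, not all zero, with
\exists\, (\lambda_0, \lambda_1, \dots, \lambda_m) \neq 0, \quad \lambda_i \ge 0, \text{ such that}
\qquad
\lambda_0 \nabla f(x^*) + \sum_{i=1}^{m} \lambda_i \nabla g_i(x^*) = 0,
\qquad
\lambda_i\, g_i(x^*) = 0 \quad (i = 1, \dots, m).
% When a constraint qualification (e.g. a maximal-rank condition) forces
% \lambda_0 > 0, normalizing \lambda_0 = 1 yields the KKT conditions.
```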
Keywords/Search Tags: nonsmooth (nondifferentiable) optimization, quasidifferentiable optimization, quasidifferentiable functions, quasidifferentials, quasidifferential kernels, convexificators, convexificator kernels, recession functions, Demyanov difference