Nonsmooth optimization, also called nondifferentiable optimization, refers to optimization problems in which the objective function and the constraints are not all differentiable. Nonsmooth optimization is used in many areas, and the main difficulty in solving a nonsmooth optimization problem is generally that descent algorithms based on gradient (or subgradient) information cannot guarantee convergence. This paper focuses on unconstrained nonsmooth optimization problems.

To overcome this difficulty, smoothing techniques are studied for several kinds of nonsmooth optimization problems, i.e., a nonsmooth problem is approximately transformed into a smooth problem by smoothing methods. First, smoothing techniques for unconstrained finite minimax problems are studied, and two algorithms are given. Second, the smallest enclosing ball problem is studied, and two algorithms are proposed to solve it. Finally, smoothing techniques for the support vector machine problem are studied. The main research results of the paper are as follows:

(1) The study of smoothing techniques for unconstrained finite minimax problems. The unconstrained finite minimax problem is first equivalently converted into a nonsmooth unconstrained optimization problem; a new smoothing function is then constructed that replaces the nonsmooth problem by a class of smooth unconstrained optimization problems with smoothing parameters, whose limit is the unconstrained finite minimax problem (a homotopy idea). The new smoothing function ensures that the gradient and Hessian matrix of the smooth unconstrained problem are sparse linear combinations of the gradients and Hessian matrices of the original differentiable functions, so the amount of computation can be greatly reduced.
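The abstract does not spell out the thesis's smoothing function, but a classical example of the construction it describes is the exponential (log-sum-exp, or entropy-type) smoothing of the max function: for the minimax objective max_i f_i(x), the smooth surrogate f_p(x) = (1/p) ln Σ_i exp(p f_i(x)) satisfies max_i f_i(x) ≤ f_p(x) ≤ max_i f_i(x) + ln(m)/p and converges to the max as p → ∞. A minimal sketch, assuming this standard smoothing rather than the thesis's own function:

```python
import math

def smooth_max(values, p):
    """Exponential (log-sum-exp) smoothing of max(values) with
    smoothing parameter p > 0; shifted by the max so the
    exponentials cannot overflow."""
    m = max(values)
    return m + math.log(sum(math.exp(p * (v - m)) for v in values)) / p

# the approximation tightens as p grows:
# max <= smooth_max <= max + ln(m)/p  for m component functions
vals = [1.0, 2.5, 2.4]
for p in (1.0, 10.0, 100.0):
    approx = smooth_max(vals, p)   # decreases toward max(vals) = 2.5
```

Because the weights exp(p(f_i − max)) decay rapidly for non-active components, gradients and Hessians of f_p are effectively sparse combinations of those of the f_i, matching the computational saving described above.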
In addition, by combining an incomplete Cholesky decomposition with Newton's method for the smooth unconstrained problem, a modified Newton algorithm is proposed to solve the finite minimax problem. Numerical results show that the algorithm is efficient for unconstrained minimax problems.

(2) The study of smoothing techniques for large-scale unconstrained finite minimax problems. Smooth Newton-type algorithms need to compute the Hessian matrix, and for large-scale unconstrained finite minimax problems the computation and storage of the Hessian are very expensive. The trust-region Newton conjugate gradient method, however, does not need the full Hessian matrix: it only requires Hessian-vector products, which greatly reduces computation and storage costs. Hence, for large-scale finite minimax problems, a trust-region Newton conjugate gradient algorithm is proposed to solve the corresponding smooth problem. The numerical results show that the method is efficient.

(3) The study of an application of the finite minimax problem. Owing to the special structure of the smallest enclosing ball problem, it can easily be recast as an unconstrained finite minimax problem. Most existing algorithms solve the small-scale ball problem well. For large-scale smallest enclosing ball problems, however, in order to avoid computing the Hessian matrix of the objective function and thus save memory, BFGS iterations are used to update an approximation of the Hessian, and a smoothing algorithm based on limited-memory BFGS is given in this paper. This quasi-Newton method greatly reduces the computation and storage requirements in practical calculation. Numerical results show that the limited-memory BFGS method is efficient for large-scale smallest enclosing ball problems.

(4) The study of the smallest enclosing ball problem.
After the smallest enclosing ball problem is converted into a smooth unconstrained problem, another method, a trust-region Newton conjugate gradient algorithm, is proposed to solve it; the Hessian matrix does not need to be computed, and the solution is updated along the Newton direction with a trust-region search. The numerical results indicate that the algorithm has a fast local convergence speed.

(5) The study of smoothing methods for the support vector machine (SVM) problem. As is well known, support vector machines generally require solving a constrained convex quadratic programming problem; using its structural characteristics, the convex quadratic programming problem can be converted into an unconstrained optimization problem involving the function max(0, x). An entropy function is then adopted to transform this problem approximately into a smooth unconstrained optimization problem, and a new smoothing algorithm is given to solve it. The error bound of the new method is O(1/p^2); moreover, we prove that the SSVM algorithm also has a tighter error bound O(1/p^2). Numerical experiments show that the new smoothing algorithm for support vector machine problems is efficient.
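The key point behind items (2) and (4) is that Newton-CG methods never form the Hessian: the conjugate gradient inner loop only consumes Hessian-vector products. A minimal sketch of one (unregularized) Newton-CG step, not the thesis's trust-region algorithm, where the product H(x)v is approximated by a finite difference of gradients so no matrix is ever stored:

```python
import numpy as np

def hessvec(grad, x, v, eps=1e-6):
    """Approximate H(x) @ v by a finite difference of gradients;
    the full Hessian is never formed or stored."""
    return (grad(x + eps * v) - grad(x)) / eps

def newton_cg_step(grad, x, tol=1e-8, max_iter=50):
    """Approximately solve the Newton system H(x) d = -grad(x)
    by conjugate gradients, using only Hessian-vector products."""
    g = grad(x)
    d = np.zeros_like(x)
    r = -g.copy()            # residual of H d = -g at d = 0
    p = r.copy()
    for _ in range(max_iter):
        Hp = hessvec(grad, x, p)
        alpha = (r @ r) / (p @ Hp)
        d += alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < tol:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return d

# usage on a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# where one exact Newton step lands on the minimizer A^{-1} b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
step = newton_cg_step(grad, np.zeros(2))
```

For the smoothed minimax objective, `grad` would be the gradient of the smooth surrogate; a production version would add the trust-region safeguard described in the abstract.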
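For item (5), a standard entropy-type smoothing of the plus function max(0, x) in the smooth-SVM literature (e.g. Lee and Mangasarian's SSVM) is p(x, α) = x + (1/α) ln(1 + e^{-αx}), whose gap to max(0, x) is largest at x = 0, where it equals ln(2)/α. A sketch under the assumption that the thesis uses this or a closely related smoothing:

```python
import math

def smooth_plus(x, alpha):
    """Entropy-type smoothing of max(0, x) with parameter alpha > 0,
    written in two branches so exp() never overflows."""
    t = alpha * x
    if t > 0:
        return x + math.log1p(math.exp(-t)) / alpha
    return math.log1p(math.exp(t)) / alpha

# worst-case gap to max(0, x) occurs at x = 0 and equals ln(2)/alpha,
# so the approximation tightens as alpha grows
max_gap = smooth_plus(0.0, 10.0)   # = ln(2)/10
```

Replacing max(0, x) by `smooth_plus` in the unconstrained SVM reformulation yields the smooth unconstrained problem mentioned above, to which Newton-type methods apply; the O(1/p^2) error bounds stated in the abstract concern how fast this approximation error vanishes in the smoothing parameter.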