
Algorithm Study On Three Types Of Nonconvex Nonsmooth Optimization Problems With Special Structure

Posted on: 2022-08-07    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Y C Liu    Full Text: PDF
GTID: 1480306320981999    Subject: Basic mathematics
Abstract/Summary:
Recently, a large number of weakly convex, large-scale nonconvex nonsmooth, and constrained sparse optimization problems with special structure have drawn great attention in many fields, such as data processing and dimensionality reduction, wireless sensor networks, signal and image processing, and machine learning. Designing simple but efficient algorithms that exploit the special structure and properties of these problems is a hot topic in optimization. In this thesis, we focus on three types of nonconvex nonsmooth problems with special structure. Based on the incremental algorithm, the smoothing method, the Moreau envelope, and the proximal gradient algorithm, we tailor suitable algorithms and study their convergence as well as their complexity.

First, we consider the problem of minimizing the sum of nonconvex smooth component functions and a nonsmooth weakly convex function composed with a linear operator. We propose a variable smoothing incremental aggregated gradient (VSIAG) algorithm for this problem: the nonsmooth term is smoothed by its Moreau envelope with a decreasing sequence of smoothing parameters, and this smoothing is combined with an incremental aggregated gradient scheme. Compared with related work, VSIAG avoids computing the full gradient at each iteration, thereby reducing the cost of gradient evaluations. Our analysis relies on a specially constructed sequence of auxiliary functions that reveals the sufficient descent of the iterates, and we prove a complexity of O(ε^{-3}) to reach an ε-approximate solution.

Second, we study the proximal incremental aggregated gradient (PIAG) method for minimizing the sum of smooth component functions and a nonsmooth function, both of which are allowed to be nonconvex. We establish a linear convergence rate under an error bound condition, relaxing the convexity assumptions required by existing analyses. The analysis is built on a special auxiliary Lyapunov sequence that captures the convergence behavior of PIAG: we show that this auxiliary sequence converges Q-linearly, and from this we deduce that the objective values and the iterates converge R-linearly. Preliminary computational experience is also reported.

Finally, we consider a proximal variable smoothing method for nonconvex nonsmooth constrained sparse optimization problems (NNCSOP), in which the objective is the sum of nonconvex smooth component functions, the composition of a weakly convex function with a linear operator, and a nonsmooth convex function. We first treat the special case of NNCSOP with a single smooth function, propose a variable smoothing proximal gradient (VSPG) method for it, and prove a complexity of O(ε^{-3}) to reach an ε-approximate solution. Building on VSPG, we then propose a variable smoothing proximal incremental aggregated gradient (VSPIAG) algorithm for the general NNCSOP and prove its sublinear convergence rate. VSPG and VSPIAG only need to evaluate the proximity operators of the two nonsmooth functions separately; they never require the proximity operator of the sum of the two nonsmooth functions, which reduces the difficulty of solving the problem. Preliminary numerical experiments on nonnegative sparse principal component analysis illustrate the efficiency of VSPIAG.
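For reference, the variable smoothing used in VSIAG, VSPG, and VSPIAG replaces the nonsmooth term h by its Moreau envelope. The following are the standard definitions (standard notation, not taken verbatim from the thesis):

    M_{\mu h}(z) = \min_{y}\Big\{ h(y) + \tfrac{1}{2\mu}\|y - z\|^2 \Big\},
    \qquad
    \nabla M_{\mu h}(z) = \frac{1}{\mu}\big(z - \operatorname{prox}_{\mu h}(z)\big),
    \quad\text{where}\quad
    \operatorname{prox}_{\mu h}(z) = \operatorname*{arg\,min}_{y}\Big\{ h(y) + \tfrac{1}{2\mu}\|y - z\|^2 \Big\}.

For convex h the envelope gradient is (1/μ)-Lipschitz, and the same holds for weakly convex h once μ is below the weak-convexity threshold; driving μ_k toward zero thus trades smoothness against approximation accuracy, which is the trade-off a complexity analysis of this kind must balance.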
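To make the VSPG-style iteration concrete, below is a minimal Python sketch of one possible variable smoothing proximal gradient loop for min_x f(x) + h(Ax) + g(x). The step-size rule, the μ_k schedule, and the choice of the ℓ1 norm for both h and g are illustrative assumptions, not the thesis's actual parameters; the point to observe is that the loop only ever calls the two proximity operators separately, never the prox of h∘A + g.

    import numpy as np

    def soft_threshold(v, t):
        # Proximity operator of t*||.||_1; stands in for prox_h and prox_g below.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def moreau_grad(z, mu, prox_h):
        # Gradient of the Moreau envelope of h with parameter mu:
        # grad M_{mu h}(z) = (z - prox_{mu h}(z)) / mu.
        return (z - prox_h(z, mu)) / mu

    def vspg_sketch(grad_f, A, prox_h, prox_g, x0, n_iter=500):
        # Each iteration: smooth h(Ax) via its Moreau envelope with a
        # decreasing parameter mu_k, take a gradient step on the smooth
        # surrogate f(x) + M_{mu h}(Ax), then apply the prox of g.
        # The mu_k schedule and step size here are illustrative choices.
        x = x0.copy()
        norm_A = np.linalg.norm(A, 2)            # spectral norm of A
        for k in range(1, n_iter + 1):
            mu = k ** (-1.0 / 3.0)               # decreasing smoothing parameter
            grad = grad_f(x) + A.T @ moreau_grad(A @ x, mu, prox_h)
            step = 1.0 / (1.0 + norm_A ** 2 / mu)  # crude Lipschitz-type bound
            x = prox_g(x - step * grad, step)
        return x

    # Toy usage: min_x 0.5*||x - b||^2 + ||A x||_1 + 0.1*||x||_1
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 50))
    b = rng.standard_normal(50)
    x = vspg_sketch(grad_f=lambda x: x - b,
                    A=A,
                    prox_h=soft_threshold,
                    prox_g=lambda v, t: soft_threshold(v, 0.1 * t),
                    x0=np.zeros(50))

A VSPIAG-style variant would additionally replace grad_f with an aggregated gradient assembled from delayed component gradients, so that only one component is refreshed per iteration.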
Keywords/Search Tags:Nonconvex nonsmooth problem, Incremental algorithm, Smoothing method, Convergence, Proximal gradient algorithm