
Research On Splitting Algorithms For Two Kinds Of Nonconvex Separable Optimization Problems

Posted on: 2024-01-23  Degree: Master  Type: Thesis
Country: China  Candidate: D L Cui  Full Text: PDF
GTID: 2530307124983879  Subject: Mathematics
Abstract/Summary:
Nonconvex separable optimization problems have many applications in practical engineering, such as background extraction, data mining, image processing, and power system economic dispatch. This dissertation considers two kinds of nonconvex separable optimization problems: the two-block separable nonconvex optimization problem with nonlinear and affine linear constraints, and the three-block nonconvex optimization problem with linear constraints. First, a double-step-length symmetric splitting sequential quadratic programming (DSL-SS-SQP) algorithm is proposed for solving the two-block nonconvex optimization problem with nonlinear constraints. With the help of the idea of symmetric splitting, the quadratic programming (QP) subproblem approximating the discussed problem is split into two small-scale QPs, which generate two improved search directions for the two primal variables. The augmented Lagrangian function is used as the merit function, and two step sizes are obtained by performing the Armijo line search along the two improved directions. Under mild conditions, the global convergence, strong convergence, and iterative complexity of the DSL-SS-SQP algorithm are proved, and the Maratos effect is analyzed. Numerical results, including comparisons with the IPOPT solver, show that the proposed algorithm is efficient. Secondly, a partially symmetric linearized Bregman alternating direction method of multipliers (ADMM) is proposed for three-block nonconvex optimization. The third primal variable is updated twice per iteration, with a linearization technique introduced in its first update, while Bregman distances are added to the subproblems of the other primal variables. The proposed method updates the Lagrange multiplier twice per iteration with different relaxation factors. The global convergence of the proposed algorithm is proved under suitable conditions, and its strong convergence and convergence rate are established under the Kurdyka–Łojasiewicz property. Finally, the proposed method is applied to the principal component analysis (PCA) problem. Numerical results show the effectiveness of the proposed algorithm.
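The update order described for the three-block method (two updates of the third primal variable, the first one linearized, and two multiplier updates with different relaxation factors) can be illustrated with a small sketch. This is not the thesis's algorithm: the toy objective (three quadratic blocks coupled by a single linear constraint), the parameter values, and the exact ordering of the steps are all assumptions chosen so that every subproblem has a closed form; the Bregman distance is taken to be the simple proximal term (α/2)‖x − x_k‖².

```python
import numpy as np

# Hedged sketch of a partially symmetric linearized Bregman-type ADMM for
#   min f(x) + g(y) + h(z)   s.t.   x + y + z = b,
# with the toy choice f = g = h = 0.5*||.||^2 (an assumption, not the thesis's
# test problem), so all subproblems are solvable in closed form.
def admm_three_block(b, beta=2.0, alpha=1.0, r=0.5, s=0.9, iters=200):
    n = b.size
    x, y, z, lam = (np.zeros(n) for _ in range(4))
    for _ in range(iters):
        # x-subproblem with a proximal Bregman term (alpha/2)*||x - x_k||^2:
        # argmin 0.5||x||^2 - lam.(x+y+z-b) + (beta/2)||x+y+z-b||^2 + Bregman
        x = (lam - beta * (y + z - b) + alpha * x) / (1.0 + beta + alpha)
        # y-subproblem has the same structure
        y = (lam - beta * (x + z - b) + alpha * y) / (1.0 + beta + alpha)
        # first z-update: linearized, i.e. one gradient step on the augmented
        # Lagrangian with step size 1/tau (tau > beta, assumed for stability)
        tau = beta + 1.0
        grad = z - lam + beta * (x + y + z - b)
        z = z - grad / tau
        # first multiplier update, relaxation factor r
        lam = lam - r * beta * (x + y + z - b)
        # second z-update: exact minimization of the augmented Lagrangian in z
        z = (lam - beta * (x + y - b)) / (1.0 + beta)
        # second multiplier update, a different relaxation factor s
        lam = lam - s * beta * (x + y + z - b)
    return x, y, z, lam
```

For this convex toy instance the scheme drives the constraint residual to zero and each block to b/3; it is meant only to make the step ordering concrete, not to reproduce the nonconvex analysis of the dissertation.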
Keywords/Search Tags: nonconvex separable optimization, sequential quadratic programming, alternating direction method of multipliers, nonlinear constraints, linear constraints, convergence and rate of convergence