The problem of nonconvex and nonsmooth block optimization arises widely in practical applications such as compressed sensing, image and signal processing, and tensor decomposition. The alternating direction method of multipliers (ADMM) has been one of the most powerful and successful methods for convex problems that can be divided into blocks. However, the convergence of the generalized ADMM for nonconvex problems cannot be guaranteed. Therefore, it is of great theoretical and practical significance to study the convergence of improved ADMM variants for nonconvex block problems.

In this thesis, we mainly study the construction and convergence of Bregman ADMM for nonconvex and nonsmooth block optimization problems. The main contents are as follows. Firstly, we propose a Bregman ADMM for nonconvex and nonsmooth three-block problems with a coupled objective function. Under the conditions that the potential function satisfies the Kurdyka-Łojasiewicz inequality property and the penalty parameter is larger than a certain constant, the convergence of the algorithm is analyzed, and the conclusion is then extended to the multi-block case. Secondly, on the basis of the Bregman ADMM, the linearization idea is introduced. Finally, the linearized Bregman ADMM is proposed and its convergence is analyzed.
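As a rough illustration of the setting described above (the exact model, block splitting, and Bregman kernels in the thesis may differ), a three-block problem with a coupled objective and one iteration of a Bregman ADMM can be sketched as follows, where $\Delta_{\phi}$ denotes the Bregman distance generated by a convex function $\phi$:

```latex
% Three-block nonconvex, nonsmooth model with coupling term H:
\min_{x,y,z}\; f(x)+g(y)+h(z)+H(x,y,z)
\quad \text{s.t.}\quad Ax+By+Cz=b.

% Augmented Lagrangian with penalty parameter \beta>0:
L_{\beta}(x,y,z,\lambda)
= f(x)+g(y)+h(z)+H(x,y,z)
+ \langle \lambda,\, Ax+By+Cz-b\rangle
+ \tfrac{\beta}{2}\,\|Ax+By+Cz-b\|^{2}.

% Bregman distance generated by a convex function \phi:
\Delta_{\phi}(u,v) = \phi(u)-\phi(v)-\langle \nabla\phi(v),\, u-v\rangle.

% One sweep of a Bregman ADMM: each block subproblem is
% regularized by a Bregman proximal term instead of the
% usual quadratic one, followed by the dual ascent step.
x^{k+1} \in \arg\min_{x}\; L_{\beta}(x,y^{k},z^{k},\lambda^{k})
          + \Delta_{\phi_{1}}(x,x^{k}),\\
y^{k+1} \in \arg\min_{y}\; L_{\beta}(x^{k+1},y,z^{k},\lambda^{k})
          + \Delta_{\phi_{2}}(y,y^{k}),\\
z^{k+1} \in \arg\min_{z}\; L_{\beta}(x^{k+1},y^{k+1},z,\lambda^{k})
          + \Delta_{\phi_{3}}(z,z^{k}),\\
\lambda^{k+1} = \lambda^{k}
  + \beta\,(Ax^{k+1}+By^{k+1}+Cz^{k+1}-b).
```

Choosing $\phi_i(u)=\tfrac{1}{2}\|u\|^{2}$ recovers the classical quadratic proximal term, since then $\Delta_{\phi_i}(u,v)=\tfrac{1}{2}\|u-v\|^{2}$; the linearized variant mentioned above would additionally replace the hard subproblems by minimizing a linearization of the smooth part plus the Bregman term.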