
Accelerated Alternating Direction Method Of Multipliers For Solving Separable Convex Optimization Problems And Applications

Posted on: 2022-12-17
Degree: Master
Type: Thesis
Country: China
Candidate: X Q Meng
Full Text: PDF
GTID: 2480306764468364
Subject: Computer Software and Application of Computer
Abstract/Summary:
With the rapid development of data-intensive science in the era of "Internet Plus", many large-scale problems in the practical world can be formulated approximately as convex optimization models. There is a great deal of literature on tackling convex optimization problems with linear constraints, which have a wide range of applications in information science, image processing, machine learning, and other fields. It is therefore crucial to devise practical numerical algorithms for this class of optimization problems.

Many methods are available for solving such problems. This thesis is devoted to linearly constrained convex optimization problems with separable structure, approached through canonical duality theory. The alternating direction method of multipliers (ADMM) is an intensively studied algorithmic framework for solving the Fenchel-Rockafellar dual problem. However, as the number of iterations increases, the computational efficiency of ADMM gradually deteriorates. Accordingly, this thesis develops accelerated modifications of the classical ADMM that reduce the number of iterations required to approach the optimum.

Goldstein et al. developed an accelerated ADMM for separable problems in which both subfunctions of the objective are strongly convex. Following the line of Goldstein et al.'s work, this thesis explores the iterative scheme of accelerated ADMM while weakening the prerequisites on the component objective functions and the penalty parameter. Specifically, by exploiting the special structure of the dual problem, the convergence conditions of the accelerated ADMM are weakened in two respects:

(1) Only one of the subfunctions of the separable objective is required to be strongly convex.

(2) The requirement on the penalty parameter is further relaxed: the penalty parameter is only required to be positive.

In the theoretical analysis, this thesis proves that the accelerated ADMM still guarantees convergence under these weakened conditions
and theoretically obtains a sharper convergence result: the rate of the dual residual d_k is improved from ||d_k||^2 = O(1/k^2) to lim_{k→∞} k^2 ||d_k||^2 = 0, i.e., ||d_k||^2 = o(1/k^2).

In the numerical experiments, the relaxed model is applied to a practical optimization problem: wavelet-based image reconstruction. By solving the dual of the original optimization problem, fast reconstruction of the image is achieved. The numerical results are compared with those of the existing classical ADMM and the accelerated linearized ADMM. The signal-to-noise-ratio curves of the three algorithms over the iterations, together with the quality of the reconstructed images, verify the numerical validity and superiority of the novel accelerated ADMM.
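The accelerated ADMM framework summarized above can be illustrated on a toy problem. The sketch below is not the thesis's algorithm: it applies a Nesterov-type extrapolation step in the style of Goldstein et al.'s fast ADMM to the scalar separable problem min 0.5*(x-a)^2 + 0.5*mu*z^2 subject to x - z = 0, where both subproblems have closed-form minimizers. All names and parameter values (fast_admm, a, mu, rho) are chosen for this demo only.

```python
import math

def fast_admm(a=2.0, mu=1.0, rho=1.0, iters=200):
    """Nesterov-accelerated ADMM (in the style of Goldstein et al.) for
    min 0.5*(x-a)^2 + 0.5*mu*z^2  s.t.  x - z = 0.
    Both terms are strongly convex, so no restart rule is needed."""
    z = u = 0.0           # primal block z and scaled dual variable u
    z_hat, u_hat = z, u   # extrapolated (accelerated) copies used in the updates
    alpha = 1.0           # momentum coefficient, updated FISTA-style
    for _ in range(iters):
        # x-update: argmin_x 0.5*(x-a)^2 + 0.5*rho*(x - z_hat + u_hat)^2
        x = (a + rho * (z_hat - u_hat)) / (1.0 + rho)
        # z-update: argmin_z 0.5*mu*z^2 + 0.5*rho*(x - z + u_hat)^2
        z_new = rho * (x + u_hat) / (mu + rho)
        # dual ascent on the scaled multiplier
        u_new = u_hat + x - z_new
        # Nesterov-type extrapolation of z and u
        alpha_new = (1.0 + math.sqrt(1.0 + 4.0 * alpha ** 2)) / 2.0
        beta = (alpha - 1.0) / alpha_new
        z_hat = z_new + beta * (z_new - z)
        u_hat = u_new + beta * (u_new - u)
        z, u, alpha = z_new, u_new, alpha_new
    return x, z

x_star, z_star = fast_admm()
# For a=2, mu=1 the optimum of 0.5*(x-2)^2 + 0.5*x^2 is x = z = 1
```

The extrapolation of (z, u) with the momentum weight beta is what distinguishes this scheme from classical ADMM, which would use (z, u) directly in the next x-update; the thesis's contribution is to retain acceleration while requiring strong convexity of only one subfunction and positivity of the penalty parameter.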
Keywords/Search Tags:Fenchel-Rockafellar Dual, Separable Convex Optimization, Alternating Direction Method of Multipliers, Acceleration Techniques, Image Reconstruction