
A Descent Algorithm With Step Size For Structured Problem And Application

Posted on: 2019-10-09
Degree: Master
Type: Thesis
Country: China
Candidate: Y N Zhang
Full Text: PDF
GTID: 2370330572455304
Subject: Applied Mathematics
Abstract/Summary:
In recent years, research on optimization problems has received widespread attention. With the advent of the era of big data, the underlying models arising in many application areas, such as engineering, information science, and biology, can be formulated as equality-constrained optimization problems. Because the objective function may be nonsmooth and the problem scale is often very large, traditional interior-point algorithms are not suitable; such problems are better solved by first-order methods such as the augmented Lagrangian method or the alternating direction method of multipliers (ADMM). The ADMM exploits the separability of the two blocks of variables in the objective function to split one large subproblem into two smaller ones, which greatly simplifies the subproblem computations. Owing to the important application value of these problems, a series of improved ADMM algorithms has been proposed in recent decades. Among them, the ADMM descent algorithm (DADMM) improves the convergence speed of the ADMM while keeping the per-iteration cost essentially unchanged. In addition, some scholars have suggested using a random step size to accelerate the projection and contraction algorithm. In this thesis, building on the ADMM descent algorithm, we propose a new ADMM descent algorithm with a random step size (RDADMM) to further improve the convergence speed.

For the case in which the objective function has special structure, the linearized ADMM (LADMM) has been proposed. By linearizing the subproblems of the ADMM, the LADMM exploits the structure of the objective function so that the subproblems admit explicit solutions. Building on this, some scholars have combined the LADMM with the DADMM to obtain a linearized ADMM descent algorithm, which improves the convergence speed of the LADMM while preserving the explicit solutions of the subproblems. In this thesis, based on the linearized ADMM descent algorithm, we propose a new linearized ADMM descent algorithm with a random step size (RDLADMM). Numerical experiments on quadratic programming with equality constraints verify the computational efficiency of the new algorithms under mild assumptions.

The thesis is organized as follows. Chapter 1 discusses the background and significance of the topic, the domestic and international research status of the alternating direction method of multipliers, and the main work of this thesis. Chapter 2 introduces the theoretical results: to solve the structured convex optimization problem with two blocks of variables, we propose the two new algorithms RDADMM and RDLADMM. Chapter 3 gives the convergence proofs and analysis of the new algorithms. Chapter 4 derives the O(1/t) convergence rate of the new algorithms. Chapter 5 reports the numerical results, confirming the efficiency of the algorithms when applied to the structured convex optimization problem with two blocks of variables. Finally, the last chapter summarizes the thesis and discusses prospects for future research.
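As context for the two-block splitting described above, the following is a minimal sketch of the classical two-block ADMM, not the thesis's RDADMM or RDLADMM: the toy problem, the penalty parameter beta, and all variable names are illustrative assumptions. It solves a small equality-constrained quadratic program, min 0.5||x-a||^2 + 0.5||y-b||^2 subject to x + y = c, for which each subproblem of the augmented Lagrangian has a closed-form minimizer.

```python
import numpy as np

# Toy two-block problem (illustrative choice, not from the thesis):
#   min 0.5*||x - a||^2 + 0.5*||y - b||^2   s.t.  x + y = c
a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])
c = np.array([2.0, 2.0])

beta = 1.0                           # penalty parameter of the augmented Lagrangian
x = np.zeros(2)                      # first primal block
y = np.zeros(2)                      # second primal block
lam = np.zeros(2)                    # multiplier of the constraint x + y = c

for _ in range(200):
    # x-subproblem: minimize the augmented Lagrangian in x (quadratic, closed form)
    x = (a - lam - beta * (y - c)) / (1.0 + beta)
    # y-subproblem: uses the freshly updated x (Gauss-Seidel sweep)
    y = (b - lam - beta * (x - c)) / (1.0 + beta)
    # dual ascent step on the multiplier
    lam = lam + beta * (x + y - c)

# Closed-form solution of the toy problem, for comparison
x_star = (a - b + c) / 2.0           # = [0, 0]
y_star = (b - a + c) / 2.0           # = [2, 2]
print(np.allclose(x, x_star, atol=1e-6), np.allclose(y, y_star, atol=1e-6))
```

The descent direction of the DADMM and the random step size of the RDADMM/RDLADMM modify how the next iterate is formed from this basic sweep; those details are developed in the thesis itself.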
Keywords/Search Tags: variational inequalities, alternating direction method of multipliers, linearized alternating direction method of multipliers, random step size