
A Study On Some Problems Of Alternating Direction Method Of Multipliers

Posted on: 2019-12-31
Degree: Master
Type: Thesis
Country: China
Candidate: Q G Chen
Full Text: PDF
GTID: 2370330551459986
Subject: Applied Mathematics
Abstract/Summary:
Due to the explosive growth of data, traditional machine learning algorithms can no longer cope with the emerging challenges, and distributed algorithms have become a means of solving big-data problems. It is therefore important to combine distributed algorithms with traditional machine learning algorithms. The alternating direction method of multipliers (ADMM) is an effective distributed algorithm, and the research in this paper is based on it. We study the optimal step size for the l1-regularized problem, an ADMM algorithm for the distributed support vector machine, and an ADMM algorithm for the online distributed support vector machine. The details are as follows.

1. ADMM has become an effective method for solving large-scale structured optimization problems. Although there are many studies on the convergence of ADMM, the quantitative effect of the algorithm's parameters on convergence still needs further study; in practice the step size is usually chosen empirically. In this paper, we study the convergence of ADMM for the Lasso, an important regularized minimization problem. We find that the solution can be expressed by the soft-threshold operator, and that the three cases of the soft threshold can be equivalently transformed into two cases of the algorithm's convergence factor. The optimal step size is then obtained by minimizing the convergence factor. Experimental results show that the proposed step size yields faster convergence than other step sizes. In addition, when the method is applied to the compressive sensing problem, an approximate strategy for computing the optimal step size is given, and good experimental results are obtained.

2. The support vector machine (SVM) is a well-known and effective learning algorithm for data processing. This paper proposes a novel distributed SVM algorithm in master-slave mode (MS-DSVM), which integrates a distributed SVM and ADMM acting in a master-slave
configuration in which the master node and slave nodes are connected, so that results can be broadcast. The distributed SVM is cast as a regularised optimisation problem and modelled as a series of convex optimisation sub-problems that are solved by ADMM. Additionally, the over-relaxation technique is utilised to accelerate the convergence of the proposed MS-DSVM. Our theoretical analysis demonstrates that the proposed MS-DSVM has linear convergence, meaning it possesses the fastest convergence rate among existing standard distributed ADMM algorithms. Numerical examples demonstrate that the convergence and accuracy of the proposed MS-DSVM are superior to those of existing methods under the ADMM framework.

3. In many distributed learning tasks, data arrive sequentially and possibly asynchronously, and traditional batch-processing algorithms cannot meet these requirements. We therefore propose an online distributed SVM algorithm based on ADMM. By introducing time parameters into the data matrix, we make some modifications to the MS-DSVM algorithm. The data inflow changes over time, and there is no restriction on how data enter a node: any node can accept any amount of data at any time. Experiments demonstrate that the online algorithm can track changes in the training set and gives good real-time classification.
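The soft-thresholding form of the ADMM iteration for the Lasso (point 1 above) can be sketched as follows. This is a minimal illustration of the standard scaled-form ADMM updates, not the thesis's implementation; in particular, the step size `rho` is taken as a plain user input here rather than the optimised value derived in the thesis.

```python
import numpy as np

def soft_threshold(v, kappa):
    """Soft-threshold operator S_kappa(v) = sign(v) * max(|v| - kappa, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """ADMM for the Lasso: min (1/2)||Ax - b||^2 + lam*||z||_1  s.t. x = z.

    rho is the ADMM step size (penalty parameter); its optimal choice
    is what the thesis analyses, but here it is simply fixed.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # Cache the Cholesky factor of (A^T A + rho I) reused in every x-update.
    AtA = A.T @ A
    Atb = A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    for _ in range(n_iter):
        # x-update: ridge-regression solve via the cached factorization
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: closed form given by the soft-threshold operator
        z = soft_threshold(x + u, lam / rho)
        # u-update: scaled dual ascent on the constraint x = z
        u = u + x - z
    return z
```

The z-update is where the soft-threshold operator appears: each coordinate is shrunk toward zero by `lam / rho`, which is why the cases of the soft threshold govern the convergence factor studied in the thesis.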
Keywords/Search Tags:Alternating direction method of multipliers, Optimal step size, Support vector machine, Distributed algorithm, Online algorithm