
The Algorithm And Convergence Analysis Of The Smooth Optimization Problem Of Separable Functions

Posted on: 2021-01-24    Degree: Master    Type: Thesis
Country: China    Candidate: X H Qian    Full Text: PDF
GTID: 2430330623984512    Subject: Mathematics
Abstract/Summary:
The descent method is one of the important methods for solving unconstrained smooth optimization problems. When the objective function is the sum of several smooth functions, the problem is called a separable optimization problem. Several authors have proposed and studied the incremental gradient algorithm for solving problems of this kind. At each iteration, the incremental gradient method takes the negative gradient direction of a single component function as the iteration direction, which reduces the computational cost of the algorithm. As far as we know, the results in the literature on the incremental gradient algorithm are based on the divergent step size rule and the constant step size. However, the incremental gradient method with constant step sizes is limited to some special cases, and numerical experiments show that the convergence efficiency of the incremental gradient algorithm with the divergent step size rule is unsatisfactory.

The present thesis focuses on effective numerical methods for solving separable optimization problems. First, the split gradient algorithm is proposed: at each iteration it takes the negative gradient direction of one component function as the iteration direction, provided that this direction is a descent direction. The convergence properties of the algorithm are studied, and numerical results show that the split gradient algorithm is more effective than the gradient algorithm and more stable than the stochastic descent algorithm. Secondly, inspired by [Blatt D, Hero A, Gauchman H. SIAM J. Optim., 29-51, 2007] and [Bertsekas D P, Tsitsiklis J N. SIAM J. Optim., 627-642, 2000], the incremental aggregated gradient algorithm with the divergent step size rule and the scaled incremental gradient algorithm with the divergent step size rule are proposed, together with their respective convergence results. Finally, numerical experiments illustrate the convergence properties and show that, in general, the scaled incremental gradient algorithm is more effective than the incremental aggregated gradient algorithm, which in turn is more effective than the unscaled algorithm proposed in [Bertsekas D P, Tsitsiklis J N. SIAM J. Optim., 627-642, 2000].
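To make the iteration concrete, the following is a minimal Python sketch of a cyclic incremental gradient step with a divergent step size rule, together with a rough stand-in for the descent check described for the split gradient algorithm. The function names, the step size rule alpha_k = a/(k+1), and the toy separable objective are illustrative assumptions, not the thesis's actual algorithm or parameter choices.

```python
import numpy as np

def incremental_gradient(grads, full_grad, x0, a=1.0, max_iter=1000,
                         require_descent=False):
    """Sketch of a cyclic incremental gradient method for f(x) = sum_i f_i(x).

    grads: list of component gradient functions grad f_i.
    full_grad: gradient of the full objective (used only for the descent check).
    require_descent: if True, skip component steps that are not descent
    directions for f, a hedged stand-in for the split-gradient rule above.
    """
    x = np.asarray(x0, dtype=float)
    m = len(grads)
    for k in range(max_iter):
        alpha = a / (k + 1)            # divergent step size rule: sum diverges, steps -> 0
        i = k % m                      # cycle through the component functions
        d = -grads[i](x)               # negative gradient of one component
        if require_descent and np.dot(full_grad(x), d) >= 0:
            continue                   # not a descent direction for f: skip the step
        x = x + alpha * d
    return x

# Toy separable objective: f(x) = sum_i 0.5 * ||x - c_i||^2, minimized at the mean of the c_i.
if __name__ == "__main__":
    centers = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([3.0, 1.0])]
    grads = [lambda x, c=c: x - c for c in centers]
    full_grad = lambda x: sum(g(x) for g in grads)
    x_star = incremental_gradient(grads, full_grad, x0=np.zeros(2), a=0.5,
                                  max_iter=5000, require_descent=True)
    print(x_star, "vs mean:", np.mean(centers, axis=0))
```

With require_descent=True the component step is taken only when it passes the descent test against the full gradient; this is only an illustration of the idea, not the convergence-guaranteeing rule analyzed in the thesis.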
Keywords/Search Tags: Unconstrained smooth optimization, Split gradient algorithm, Descent algorithm, Incremental gradient algorithms, Incremental aggregated gradient algorithm, Armijo step size, Divergent step size