
Convergence Of Sub-gradient Algorithm With Dynamic Step Size

Posted on: 2020-09-19 | Degree: Master | Type: Thesis
Country: China | Candidate: T. T. Zhao | Full Text: PDF
GTID: 2430330596473002 | Subject: Mathematics
Abstract/Summary:
The subgradient algorithm is a classical and important method for solving large-scale convex optimization problems, and it is well known that its convergence depends heavily on the choice of the step sizes. The main work of this paper is to modify the parameters of the subgradient algorithm with the dynamic step sizes proposed in [Math. Program., 1999, 85(1): 207-211], where two kinds of parameters are used in the algorithm. We propose two new kinds of dynamic step sizes for the subgradient algorithm and establish the convergence of the resulting algorithms, respectively. More precisely, the first uses two sequences of dynamic parameters (the second of these parameters is a constant in [Math. Program., 1999, 85(1): 207-211]); the second uses only one sequence of dynamic parameters (the other parameter is removed), and its update rule differs from the corresponding one in [Math. Program., 1999, 85(1): 207-211]. Finally, numerical experiments are provided to show that the new algorithms are more effective than the prior one.
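To make the setting concrete, below is a minimal sketch of the subgradient method with the classical Polyak dynamic step size t_k = (f(x_k) - f*) / ||g_k||^2, which is the idealized form of the dynamic step sizes the abstract discusses (the paper's specific parameter sequences and update rules are not reproduced here; the function names and the l1 example are illustrative assumptions, and the Polyak rule assumes the optimal value f* is known):

```python
import numpy as np

def subgradient_polyak(f, subgrad, x0, f_opt, max_iter=1000, tol=1e-8):
    """Subgradient method with the Polyak dynamic step size
    t_k = (f(x_k) - f_opt) / ||g_k||^2, assuming f_opt is known."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for _ in range(max_iter):
        g = subgrad(x)
        gnorm2 = float(np.dot(g, g))
        if gnorm2 == 0.0:        # zero subgradient: x is a minimizer
            break
        gap = f(x) - f_opt
        if gap <= tol:           # already close enough to optimal value
            break
        x = x - (gap / gnorm2) * g
        fx = f(x)
        if fx < best_f:          # subgradient steps are not monotone,
            best_x, best_f = x.copy(), fx  # so track the best iterate
    return best_x, best_f

# Example: minimize the nonsmooth f(x) = ||x - c||_1 (minimum 0 at x = c).
c = np.array([1.0, -2.0, 3.0])
f = lambda x: np.sum(np.abs(x - c))
subgrad = lambda x: np.sign(x - c)   # one subgradient of the l1 distance
x_best, f_best = subgradient_polyak(f, subgrad, np.zeros(3), f_opt=0.0)
```

In practice f* is unknown, which is exactly why dynamic level/target schemes such as the one in [Math. Program., 1999, 85(1): 207-211] replace f_opt with an adaptively updated estimate controlled by the parameter sequences studied in this thesis.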
Keywords/Search Tags: convex optimization, subgradient method, constant step size, diminishing step size, divergent-series step size, Polyak step size, dynamic step size