The gradient method is a class of first-order optimization methods that are simple and easy to implement. The stepsize plays an important role in gradient methods: it influences both numerical performance and convergence properties. This thesis mainly considers gradient methods for unconstrained optimization problems, covering both smooth and non-smooth objective functions, and proposes two gradient methods for the smooth and the non-smooth case, respectively.

First, general smooth unconstrained optimization problems are considered in the second chapter. We construct an approximate exact line search stepsize by exploiting the equivalence between the exact line search stepsize and the BB1 stepsize for convex quadratic minimization problems, and then extend cyclic gradient methods for convex quadratic minimization to general smooth optimization problems. Combined with a nonmonotone line search in each iteration, we establish global convergence. Furthermore, we prove that the proposed gradient methods converge sublinearly for general convex problems and R-linearly for strongly convex problems. Numerical results show that the proposed cyclic gradient methods outperform state-of-the-art methods.

In the third chapter, we consider an optimization problem with special structure: minimizing a convex quadratic function plus an L1 regularization term, that is, the LASSO problem. We extend the cyclic gradient method for smooth convex quadratic optimization to solve LASSO problems, and again construct an approximate exact line search stepsize by using a smoothing technique to approximate the non-smooth term. We prove that the extended method converges R-linearly under certain conditions. Numerical experiments show the effectiveness of the new algorithm.
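To make the relationship used in the second chapter concrete, consider a convex quadratic $f(x) = \tfrac{1}{2}x^{\top}Ax - b^{\top}x$ with gradient $g_k = Ax_k - b$; the notation below is generic and may differ from that of the thesis. The exact line search (steepest descent) stepsize and the BB1 stepsize are

\[
\alpha_k^{\mathrm{SD}} = \frac{g_k^{\top} g_k}{g_k^{\top} A g_k},
\qquad
\alpha_k^{\mathrm{BB1}} = \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} y_{k-1}},
\qquad
s_{k-1} = x_k - x_{k-1},\;\; y_{k-1} = g_k - g_{k-1}.
\]

Since $y_{k-1} = A s_{k-1}$ for a quadratic and $s_{k-1} = -\alpha_{k-1} g_{k-1}$ for a gradient step,

\[
\alpha_k^{\mathrm{BB1}}
= \frac{s_{k-1}^{\top} s_{k-1}}{s_{k-1}^{\top} A s_{k-1}}
= \frac{g_{k-1}^{\top} g_{k-1}}{g_{k-1}^{\top} A g_{k-1}}
= \alpha_{k-1}^{\mathrm{SD}},
\]

i.e., the BB1 stepsize coincides with the exact line search stepsize of the previous iteration; this is the kind of equivalence on which the approximate exact line search stepsize is built.

The LASSO problem treated in the third chapter can be written in the standard form (the matrix $A$, vector $b$, and regularization parameter $\mu > 0$ below are generic placeholders):

\[
\min_{x \in \mathbb{R}^n} \; \frac{1}{2}\|Ax - b\|_2^2 + \mu \|x\|_1 .
\]

One common way to smooth the non-smooth term, given here only as an illustration and not necessarily the smoothing used in the thesis, is to replace $\|x\|_1$ by $\sum_{i=1}^{n} \sqrt{x_i^2 + \varepsilon^2}$ for a small smoothing parameter $\varepsilon > 0$, which yields a differentiable approximation of the objective.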