
Some Research On Memory Gradient Methods For Nonlinear Optimization Problems

Posted on: 2019-04-17
Degree: Master
Type: Thesis
Country: China
Candidate: J Y Li
GTID: 2370330545993599
Subject: Applied Mathematics

Abstract/Summary:
In this thesis, we present a super memory gradient method for solving unconstrained optimization problems and a derivative-free memory gradient method for solving monotone nonlinear equations, denoted Algorithm I and Algorithm II respectively. In Algorithm I, we modify the non-monotone line search technique so that the search direction at each iteration always provides a sufficient descent step, which makes the algorithm more stable. In Algorithm II, we propose an improved line search technique that relaxes the restriction on the choice of initial points, making the algorithm better suited to solving large-scale systems of equations. Under appropriate conditions, both algorithms are globally convergent and converge R-linearly to the solution x*. Numerical results show that both algorithms are effective.
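The abstract does not give the update formulas, so the two Python sketches below only illustrate, under stated assumptions, the generic ingredients it names; the parameter values, the memory-coefficient rule, and the helper names (memory_gradient, df_memory_gradient) are hypothetical, not the thesis's Algorithms I and II. The first sketch pairs a memory gradient direction (steepest descent plus a bounded combination of the previous m directions, which forces sufficient descent) with a Grippo-Lampariello-Lucidi-style non-monotone Armijo line search.

    import numpy as np

    def memory_gradient(f, grad, x0, m=3, rho=0.4, delta=1e-4,
                        M=5, tol=1e-6, max_iter=1000):
        # Generic (super) memory gradient method with a non-monotone
        # Armijo line search.  Illustrative sketch only: parameter values
        # and the beta rule are assumptions, not the thesis's Algorithm I.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        dirs = []                # up to m previous search directions
        f_hist = [f(x)]          # recent f-values for the non-monotone test
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            # Direction: steepest descent plus a bounded combination of the
            # stored directions.  The bound beta <= rho*||g||/(m*||d_old||)
            # yields g.d <= -(1 - rho)*||g||^2, i.e. sufficient descent.
            d = -g
            for d_old in dirs:
                beta = rho * np.linalg.norm(g) / (len(dirs) * np.linalg.norm(d_old))
                d = d + beta * d_old
            # Non-monotone Armijo test: compare against the max of the last
            # M function values instead of f(x_k) alone.
            f_ref = max(f_hist[-M:])
            gtd = g @ d
            alpha = 1.0
            while f(x + alpha * d) > f_ref + delta * alpha * gtd and alpha > 1e-12:
                alpha *= 0.5
            x = x + alpha * d
            g = grad(x)
            dirs = (dirs + [d])[-m:]
            f_hist.append(f(x))
        return x

    # Usage: a strictly convex quadratic with minimizer at the origin.
    A = np.diag([1.0, 10.0])
    x_star = memory_gradient(lambda x: 0.5 * x @ A @ x, lambda x: A @ x,
                             np.array([4.0, -3.0]))

The second sketch treats monotone equations F(x) = 0 without derivatives, pairing a memory-type direction with a Solodov-Svaiter hyperplane-projection step, a standard framework for such methods; again, the direction and line-search rules are generic stand-ins for Algorithm II.

    def df_memory_gradient(F, x0, beta=0.3, sigma=1e-4, tau=0.5,
                           tol=1e-6, max_iter=1000):
        # Generic derivative-free method for monotone F(x) = 0 with a
        # memory-type direction and a hyperplane-projection step; a sketch,
        # not the thesis's Algorithm II.
        x = np.asarray(x0, dtype=float)
        Fx = F(x)
        d_prev = np.zeros_like(x)
        for _ in range(max_iter):
            if np.linalg.norm(Fx) <= tol:
                break
            d = -Fx + beta * d_prev        # residual plus a memory term
            # Derivative-free line search: shrink t until the trial point z
            # satisfies -F(z).d >= sigma * t * ||d||^2 (or t hits a floor).
            t = 1.0
            while True:
                z = x + t * d
                Fz = F(z)
                if -Fz @ d >= sigma * t * np.linalg.norm(d) ** 2 or t < 1e-12:
                    break
                t *= tau
            if np.linalg.norm(Fz) <= tol:  # trial point already solves F = 0
                return z
            # Project x onto the hyperplane {y : F(z).(y - z) = 0}, which
            # separates x from the solution set when F is monotone.
            x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
            Fx = F(x)
            d_prev = d
        return x

    # Usage: F(x) = x + sin(x) is monotone with its unique zero at the origin.
    root = df_memory_gradient(lambda x: x + np.sin(x), np.array([3.0, -2.0]))

The projection step is what lets the method avoid derivatives: once the line search finds a trial point z with F(z)·d sufficiently negative, monotonicity guarantees the hyperplane through z separates the current iterate from every solution, so projecting onto it strictly decreases the distance to the solution set.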
Keywords/Search Tags: Unconstrained optimization, Non-monotone technique, Memory gradient method, Derivative-free method, Global convergence, Local convergence rate, Numerical experiments