
Optimization Algorithm Based on Second-Order Information

Posted on: 2021-05-23 | Degree: Master | Type: Thesis
Country: China | Candidate: K G Liu | Full Text: PDF
GTID: 2370330620468269 | Subject: Computational Mathematics
Abstract/Summary:
Second-order optimization algorithms are generally developed from Newton's method, a powerful tool for solving unconstrained optimization problems. Because they use the curvature information of the objective function, they can show better robustness and faster convergence than first-order methods. However, a second-order algorithm must form the Hessian matrix and solve with its inverse at each iteration; in high-dimensional settings this incurs a large computational overhead and may even be infeasible. The quasi-Newton method, the limited-memory quasi-Newton method, and the sub-sampled Newton method are all second-order methods based on "reforming" the Hessian matrix.

This dissertation first reviews several methods for solving unconstrained optimization problems, focusing on the current development of Newton's method and quasi-Newton methods. It then proposes a quasi-Newton method with a corrected scaling coefficient. The scaling factor combines the standard quasi-Newton equation with the idea of Yuan's new quasi-Newton equation, and the convergence of the new BFGS algorithm is proved under convexity assumptions. Based on the main idea of the new BFGS algorithm, a conservative modification is given that achieves convergence under non-convex conditions.

Next, the newly modified BFGS algorithm proposed by Chu et al. is considered. Exploiting its special form, it is generalized to the limited-memory framework, which reduces the storage requirement of the problem while its convergence can still be proved.

Based on the traditional sub-sampled Newton method, a combination of uniform and non-uniform sampling is then considered: the gradient is sampled uniformly while the Hessian matrix is sampled non-uniformly, which avoids processing all samples and is well suited to "big data" settings.

Finally, numerical experiments are carried out on the proposed algorithms. Compared with the standard methods, the numerical results of our algorithms are more stable and show better practicability.
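To make the scaled quasi-Newton idea concrete, the sketch below shows one inverse-Hessian BFGS update built on a Yuan-style modified secant vector y_k + (theta_k/||s_k||^2) s_k, with theta_k = 2(f_k - f_{k+1}) + (g_k + g_{k+1})^T s_k. This particular form of the correction term, and all function names, are illustrative assumptions; the thesis's exact corrected scaling coefficient may differ.

    import numpy as np

    def modified_bfgs_update(H, s, y, f_old, f_new, g_old, g_new, eps=1e-10):
        # Yuan-style modified secant vector using function-value information
        # (assumed form): y_t = y + (theta / ||s||^2) * s
        theta = 2.0 * (f_old - f_new) + (g_new + g_old) @ s
        y_t = y + (theta / (s @ s)) * s
        sy = s @ y_t
        # Skip the update if the curvature condition fails, keeping H positive definite.
        if sy <= eps * np.linalg.norm(s) * np.linalg.norm(y_t):
            return H
        rho = 1.0 / sy
        I = np.eye(len(s))
        # Standard inverse BFGS update, applied with the modified secant vector.
        V = I - rho * np.outer(s, y_t)
        return V @ H @ V.T + rho * np.outer(s, s)

Skipping the update when s^T y_t is too small is one simple safeguard; a conservative modification of theta (for example, clipping it at zero) is one way such schemes retain convergence without convexity.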
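The limited-memory generalization avoids storing a dense matrix by keeping only the m most recent correction pairs. The following is a generic L-BFGS two-loop recursion sketch, not the specific scheme of Chu et al.; the modified pairs (s_k, y_t) from the update above would simply be stored in place of the classical ones.

    import numpy as np

    def lbfgs_direction(g, pairs, gamma=1.0):
        # Two-loop recursion: returns the search direction -H_k g using the
        # stored (s, y) pairs, ordered oldest to newest. gamma scales the
        # initial Hessian approximation gamma * I (often s^T y / y^T y).
        q = g.copy()
        alphas = []
        for s, y in reversed(pairs):           # first loop: newest to oldest
            rho = 1.0 / (y @ s)
            a = rho * (s @ q)
            alphas.append((rho, a))
            q -= a * y
        r = gamma * q
        for (s, y), (rho, a) in zip(pairs, reversed(alphas)):
            b = rho * (y @ r)                  # second loop: oldest to newest
            r += (a - b) * s
        return -r

Only O(m d) memory is needed for d-dimensional problems, which is the storage reduction the dissertation exploits when moving the modified BFGS update into the limited-memory framework.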
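For the mixed-sampling sub-sampled Newton method, a minimal sketch follows for a finite-sum objective f(w) = (1/n) * sum_i f_i(w): the gradient is estimated from a uniform sub-sample and the Hessian from a non-uniform, importance-weighted sub-sample, so both estimates stay unbiased. The probabilities p and all helper names are assumptions for illustration; the thesis's non-uniform scheme may differ.

    import numpy as np

    def subsampled_newton_step(w, grad_i, hess_i, n, bg, bh, p, rng):
        # grad_i(w, i) / hess_i(w, i): per-sample gradient / Hessian.
        # p: non-uniform Hessian sampling probabilities (length n, sums to 1).
        # Uniform sub-sampling of the gradient.
        Sg = rng.choice(n, size=bg, replace=False)
        g = np.mean([grad_i(w, i) for i in Sg], axis=0)
        # Non-uniform sub-sampling of the Hessian, reweighted by
        # 1/(n * p[i] * bh) so E[H] = (1/n) * sum_i hess_i(w, i).
        Sh = rng.choice(n, size=bh, replace=True, p=p)
        d = len(w)
        H = np.zeros((d, d))
        for i in Sh:
            H += hess_i(w, i) / (n * p[i] * bh)
        H += 1e-8 * np.eye(d)                  # small ridge for numerical safety
        return w - np.linalg.solve(H, g)

Because only bg gradients and bh Hessians are evaluated per step instead of all n, the per-iteration cost is independent of the dataset size, which is what makes the approach attractive in "big data" settings.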
Keywords/Search Tags: unconstrained optimization, quasi-Newton method, limited-memory quasi-Newton method, sub-sampled Newton method, global convergence