
A Limited Memory Quasi-Newton Method For Large-Scale Optimization Problems

Posted on: 2014-09-22  Degree: Master  Type: Thesis
Country: China  Candidate: H Y Jin  Full Text: PDF
GTID: 2250330425461013  Subject: Operational Research and Cybernetics
Abstract/Summary:
Quasi-Newton methods are effective for solving unconstrained optimization problems. However, counterexamples show that quasi-Newton methods need not converge globally on non-convex minimization problems. To overcome this drawback, many authors have proposed modifications of the quasi-Newton methods, such as the cautious BFGS (CBFGS) method and the modified BFGS (MBFGS) method, but these modifications still have shortcomings. To handle large-scale optimization problems, many authors have proposed limited memory quasi-Newton methods built on the quasi-Newton methods, but these perform poorly on ill-conditioned problems. Several authors have therefore introduced a regularization strategy into limited memory methods to improve numerical results; in the update formula, however, the initial matrix is usually taken to be a constant matrix, which discards useful iteration information. In this thesis, building on quasi-Newton methods and limited memory quasi-Newton methods, we propose a modification of the BFGS method with a scaling factor (SBFGS) and its limited memory version (L-SBFGS). Theoretical analysis and numerical tests show that the proposed methods are effective and stable.

First, we survey methods for solving unconstrained optimization problems, focusing on the current state and development of the BFGS method and the limited memory BFGS method, and state our working assumptions.

Secondly, based on the cautious BFGS method, we propose a modified method with a scaling factor (SBFGS) for solving non-convex unconstrained optimization problems. Compared with the CBFGS method, the quasi-Newton matrices generated by the SBFGS method are updated more effectively as the iterations proceed, and thus better approximate the Hessian matrix.
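To illustrate the kind of safeguarded update the CBFGS family uses, the sketch below implements the standard cautious BFGS rule (skip the update when the observed curvature is too weak relative to the gradient norm). This is a generic sketch of the cautious condition, not the thesis's SBFGS scaling factor; the parameters `eps` and `alpha` are illustrative choices.

```python
import numpy as np

def cautious_bfgs_update(B, s, y, g, eps=1e-6, alpha=1.0):
    """One cautious BFGS update of the Hessian approximation B.

    s = x_{k+1} - x_k, y = grad_{k+1} - grad_k, g = current gradient.
    The update is performed only when
        (y^T s) / ||s||^2 >= eps * ||g||^alpha,
    which keeps B positive definite even on non-convex problems.
    """
    sy = float(y @ s)
    if sy / float(s @ s) < eps * np.linalg.norm(g) ** alpha:
        return B  # curvature too weak: skip the update, keep B
    Bs = B @ s
    return B - np.outer(Bs, Bs) / float(s @ Bs) + np.outer(y, y) / sy
```

When the update is taken, the new matrix satisfies the secant equation `B_new @ s == y` exactly; when the cautious test fails, `B` is returned unchanged, which is what allows a global convergence proof without convexity.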
We prove that the SBFGS method with either a Wolfe-type or an Armijo-type line search converges globally on non-convex unconstrained problems.

Thirdly, based on the compact representation proposed by Byrd and Nocedal, we extend the SBFGS method to the limited memory framework and present a compact limited memory method (L-SBFGS) for large-scale unconstrained optimization problems. In particular, we consider how to select the initial matrix using iterative information. We prove that the L-SBFGS method with either a Wolfe-type or an Armijo-type line search converges globally, and we also establish its R-linear convergence.

Finally, we implement the proposed methods and compare their performance with existing similar methods. The numerical results show that the proposed SBFGS and L-SBFGS methods are more effective.
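For context, the limited memory framework the thesis extends evaluates the search direction from the m most recent curvature pairs without ever forming the matrix. The sketch below is the standard L-BFGS two-loop recursion with the common scaled initial matrix H_0 = gamma * I, gamma = s^T y / y^T y; it is not the thesis's compact L-SBFGS formula, only a minimal illustration of how iteration information can enter the initial matrix.

```python
import numpy as np

def lbfgs_direction(g, S, Y):
    """Two-loop recursion: returns d = -H_k g, where H_k is the
    limited-memory BFGS inverse-Hessian approximation built from the
    stored pairs (s_i, y_i) in lists S, Y (oldest first).

    The initial matrix is gamma * I with gamma = s^T y / y^T y from the
    most recent pair, so H_0 carries recent curvature information
    instead of being a fixed constant matrix.
    """
    q = g.copy()
    rhos = [1.0 / float(y @ s) for s, y in zip(S, Y)]
    alphas = []
    # Backward pass: newest pair first.
    for s, y, rho in reversed(list(zip(S, Y, rhos))):
        a = rho * float(s @ q)
        alphas.append(a)
        q -= a * y
    gamma = float(S[-1] @ Y[-1]) / float(Y[-1] @ Y[-1])
    r = gamma * q  # apply the scaled initial matrix H_0
    # Forward pass: oldest pair first, consuming alphas in reverse.
    for (s, y, rho), a in zip(zip(S, Y, rhos), reversed(alphas)):
        b = rho * float(y @ r)
        r += (a - b) * s
    return -r
```

With a single stored pair, the recursion reproduces the inverse secant equation `H @ y == s`, so calling it with `g = y` returns `-s`; storage and work are O(mn) rather than the O(n^2) of a full quasi-Newton matrix, which is the point of the limited memory approach.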
Keywords/Search Tags: large-scale optimization, limited memory quasi-Newton method, non-convex minimization, global convergence, compact formula