
Research On Unconstrained Optimization Algorithm Without Derivatives Based On Subspace Method

Posted on: 2019-02-05
Degree: Master
Type: Thesis
Country: China
Candidate: Y M Zhang
Full Text: PDF
GTID: 2310330545958278
Subject: Mathematics
Abstract/Summary:
In this thesis, we study subspace algorithms for solving unconstrained optimization problems. At each iteration, we construct a subspace so that a high-dimensional unconstrained optimization problem is reduced to a low-dimensional problem over that subspace. Most optimization methods rely on derivative information about the objective; in practice, however, derivatives of many optimization problems are difficult or impossible to obtain, so solving such problems requires derivative-free optimization algorithms.

We first introduce three interpolation models: the linear interpolation model, the quadratic interpolation model without cross terms, and the complete quadratic interpolation model. We then describe two ways of choosing the approximate Newton direction. Finally, three derivative-free subspace optimization algorithms are presented. Chapter 3 gives two derivative-free algorithms over two-dimensional subspaces, together with numerical tests: one uses the subspace spanned by the approximate gradient and the last iteration direction, and the other uses the subspace spanned by the approximate Newton direction and the last iteration direction. Chapter 4 gives a derivative-free algorithm over the three-dimensional subspace spanned by the approximate gradient, the last iteration direction, and the approximate Newton direction, and another over the four-dimensional subspace that additionally includes a random direction, again with numerical experiments.

The numerical experiments support three conclusions. First, the two-dimensional subspace algorithm built from the approximate Newton direction and the last iteration direction requires fewer iterations than the one built from the approximate gradient and the last iteration direction. Second, the three-dimensional subspace algorithm reduces the iteration count significantly compared with the two-dimensional algorithms; the four-dimensional subspace algorithm needs fewer iterations than the three-dimensional one, but the improvement is not significant. Third, among the subspace algorithms based on linear interpolation, on quadratic interpolation without cross terms, and on complete quadratic interpolation, the iteration counts decrease and the efficiency increases in that order. In addition, for subspace algorithms of the same dimension, we compare replacing the Newton direction by the approximate Newton direction obtained from quadratic interpolation with replacing it by a quasi-Newton direction; the former requires fewer iterations and is more efficient. Finally, we compare the three derivative-free subspace algorithms with existing derivative-free algorithms, and the numerical experiments show that the subspace-based derivative-free optimization algorithms are very effective.
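The overall scheme can be illustrated with a small Python sketch. It is not the thesis's exact algorithm: the function names, the step size `h`, the curvature safeguard, and the coarse two-dimensional grid search over the subspace are all illustrative assumptions. The sketch fits a quadratic interpolation model without cross terms from `2n + 1` function values, forms an approximate Newton direction from it, and then minimizes the objective over the two-dimensional subspace spanned by that direction and the previous step, using function values only.

```python
import numpy as np

def diag_quadratic_model(f, x, h=1e-3):
    """Fit a quadratic interpolation model without cross terms at x,
    using the 2n + 1 values f(x) and f(x +/- h e_i).
    Returns the approximate gradient g and diagonal curvature d."""
    n = x.size
    fx = f(x)
    g = np.empty(n)
    d = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        fp, fm = f(x + e), f(x - e)
        g[i] = (fp - fm) / (2.0 * h)          # central-difference gradient
        d[i] = (fp - 2.0 * fx + fm) / h**2    # second-difference curvature
    return g, d

def subspace_df_minimize(f, x0, iters=50, h=1e-3):
    """Derivative-free 2-D subspace iteration (sketch): each step
    minimizes f over span{approximate Newton direction, previous step}."""
    x = np.asarray(x0, dtype=float)
    d_prev = np.zeros_like(x)
    for _ in range(iters):
        g, d = diag_quadratic_model(f, x, h)
        # Safeguarded approximate Newton direction from the diagonal model.
        d_newton = -g / np.maximum(np.abs(d), 1e-8)
        # Coarse grid search over the 2-D subspace (illustrative choice;
        # any derivative-free 2-D solver could be used here instead).
        best_a, best_b, best_val = 0.0, 0.0, f(x)
        for a in np.linspace(-1.0, 2.0, 31):
            for b in np.linspace(-1.0, 1.0, 21):
                val = f(x + a * d_newton + b * d_prev)
                if val < best_val:
                    best_a, best_b, best_val = a, b, val
        step = best_a * d_newton + best_b * d_prev
        if np.linalg.norm(step) < 1e-10:      # no improving step found
            break
        x = x + step
        d_prev = step                          # reuse as "last iteration direction"
    return x
```

On a separable quadratic objective the diagonal model is exact, so the approximate Newton direction points at the minimizer and the iteration terminates in one step; on general problems the cross-term-free model is only an approximation, which is why the thesis also studies the complete quadratic interpolation model.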
Keywords/Search Tags:subspace method, optimization without derivatives, unconstrained optimization, numerical experiment