
Research On Derivative-free Global Optimization Algorithm

Posted on: 2018-06-16    Degree: Master    Type: Thesis
Country: China    Candidate: E T Liu    Full Text: PDF
GTID: 2310330542952544    Subject: Applied Mathematics
Abstract/Summary:
Global optimization is an important branch of optimization theory. From the perspective of algorithm construction, global optimization algorithms can be divided into deterministic and stochastic methods: the filled function method and the tunneling function method are deterministic, while the genetic algorithm is a typical stochastic method. The multi-start clustering global optimization algorithm (GLOBAL) is a stochastic method that obtains the global optimum through a global sampling phase followed by a local optimization phase. It places only weak requirements on the objective function, guarantees convergence under relatively weak statistical assumptions, and can handle problems in which derivative information is unavailable. It therefore has a clear advantage for complex engineering problems whose objective functions have no explicit expression. The selection of feasible sample points and the choice of the local optimization method are the core issues of the multi-start clustering algorithm, so improving the global phase and the local phase has long been a focus of research at home and abroad.

This thesis studies the basic principle and procedure of the multi-start clustering global optimization algorithm and improves its handling of derivative-free problems. The main work is summarized as follows.

In the global phase, a hybrid sample-point selection strategy is proposed to replace the original one; the new strategy takes both the distance between sample points and their function values into account. In the local phase, a new derivative-free local method based on the simplex gradient is proposed. To further reduce the computational cost, a criterion for estimating whether a minimizer needs to be pursued is built into the line search of the new algorithm, and the convergence of the algorithm is proved. Experimental results show that the new algorithm improves efficiency and robustness considerably; in particular, for "narrow valley" functions the number of function evaluations is greatly reduced.

In addition, the non-monotone technique does not rely on local characteristics when dealing with complex nonlinear problems and can escape the "valleys" of the objective function. Based on a non-monotone derivative-free line search strategy, a new non-monotone derivative-free global optimization algorithm is proposed, which combines the global phase of the multi-start clustering algorithm with the line search. Numerical experiments show that the new algorithm reduces the number of function evaluations and finds the global optimal solution quickly.
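The abstract gives no pseudocode, but the two phases it describes can be illustrated with a short sketch. The Python snippet below is an illustrative assumption rather than the author's implementation: it combines a forward-difference simplex-gradient estimate, a hybrid sample-point selection that weights function values against spatial spread (the weight `alpha` and all other parameters are made up here), and a backtracking derivative-free line search in the local phase; a non-monotone variant would replace the current function value in the acceptance test with the maximum over the last few iterations.

```python
import numpy as np

def simplex_gradient(f, x0, h=1e-3):
    """Estimate grad f(x0) from a coordinate simplex of radius h (forward differences)."""
    n = len(x0)
    S = h * np.eye(n)                                   # columns x_i - x0 of the simplex
    df = np.array([f(x0 + S[:, i]) - f(x0) for i in range(n)])
    return np.linalg.solve(S.T, df)                     # solve S^T g = delta(f)

def hybrid_select(X, F, k, alpha=0.5):
    """Keep k start points, trading off low function value against spread (assumed weighting)."""
    f_rank = (F - F.min()) / (F.max() - F.min() + 1e-12)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    nn = D.min(axis=1)                                  # nearest-neighbour distance
    d_rank = 1.0 - nn / (nn.max() + 1e-12)              # well-spread points score low
    return np.argsort(alpha * f_rank + (1 - alpha) * d_rank)[:k]

def local_descent(f, x, lo, hi, iters=100):
    """Derivative-free local phase: backtracking line search along the negative simplex gradient."""
    fx = f(x)
    for _ in range(iters):
        g = simplex_gradient(f, x)
        t = 1.0
        while t > 1e-10:                                # Armijo-style sufficient decrease
            x_new = np.clip(x - t * g, lo, hi)
            f_new = f(x_new)
            if f_new < fx - 1e-4 * t * g.dot(g):
                x, fx = x_new, f_new
                break
            t *= 0.5
        else:
            return x, fx                                # no decrease found: stop
    return x, fx

def multistart_global(f, lo, hi, n_samples=200, n_starts=5, seed=0):
    """Global phase: uniform sampling plus hybrid selection, then local refinement of each start."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(n_samples, len(lo)))
    F = np.array([f(x) for x in X])
    results = [local_descent(f, X[i].copy(), lo, hi) for i in hybrid_select(X, F, n_starts)]
    return min(results, key=lambda r: r[1])

if __name__ == "__main__":
    himmelblau = lambda x: (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2
    x, fx = multistart_global(himmelblau, np.array([-5.0, -5.0]), np.array([5.0, 5.0]))
    print("best point:", x, "value:", fx)
```

The sketch omits the clustering step of GLOBAL (which avoids restarting local searches inside basins that have already been explored) and uses a plain coordinate simplex; the thesis's quadratic-interpolation and necessity-of-minimization criteria are not reproduced here.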
Keywords/Search Tags: global optimization, derivative-free, non-monotone, quadratic interpolation, simplex gradient