
Design And Application Of A Global Fractional-Order Gradient Descent Method

Posted on: 2022-03-05
Degree: Master
Type: Thesis
Country: China
Candidate: Z G Zhu
Full Text: PDF
GTID: 2480306323479234
Subject: Control Science and Engineering
Abstract/Summary:
With the development of science and technology, the idea of "optimization" has gradually taken root in many fields. Numerous practical problems in science and engineering can be transformed into optimization problems through abstract description, such as mathematical modeling, system identification, and machine learning. The gradient descent method plays an important role in solving all kinds of unconstrained optimization problems because of its simple structure, intuitive principle, and easy implementation. However, the traditional gradient method not only converges slowly but also often falls into local extrema when solving nonconvex optimization problems. In recent years, some scholars have tried to introduce fractional calculus into the design of gradient optimization algorithms in order to obtain better performance. However, the existing research is still in its infancy: the properties of fractional gradient descent methods have not been studied deeply enough, and existing fractional gradient descent methods still cannot solve nonconvex optimization problems well. Therefore, this dissertation is devoted to analyzing the characteristics of the fractional gradient descent method and to designing a fractional gradient descent method with fast convergence, high accuracy, and global search ability, so as to enrich and improve related work on the design and study of gradient-based optimization algorithms.

Firstly, to address the problem that the original fractional gradient method cannot converge to the first-order extremum, a fractional-order gradient method with a memory step is proposed. By analyzing and comparing the behavior of the fractional gradient method under different parameters, a recommended interval for parameter selection is given. The concept of a fractional-order tangent is introduced to illustrate the effect of different orders on the convergence speed of the algorithm.

Then, to let the algorithm jump out of local extrema and improve its global search ability, a fractional-order gradient method with disturbance is proposed. When the fractional gradient method falls into an extremum, the algorithm applies a random disturbance of appropriate amplitude, which lets it escape from the current region and search for other, possibly better solutions. Simulations on several examples verify that the algorithm has good global search ability on nonconvex problems.

Finally, for large-scale, high-dimensional, complex nonconvex optimization problems, a two-stage fractional gradient method with higher global search efficiency is designed. By using a particle swarm optimization algorithm to screen the initial iteration points, a considerable part of the computation spent converging to local extrema is saved. A convolutional neural network is trained for comparison to show the high training accuracy and search efficiency of the two-stage fractional gradient method.
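To make the "memory step" idea concrete, the following is a minimal one-dimensional sketch of a fractional-order gradient update in which the previous iterate serves as the lower terminal of a Caputo-type derivative, so the step carries a factor |x_k − x_{k−1}|^(1−α)/Γ(2−α). The function name `fractional_gd` and all parameter values are illustrative assumptions, not the thesis's exact scheme or notation.

```python
import math

def fractional_gd(grad, x0, alpha=0.8, lr=0.1, eps=1e-8, iters=500):
    """Sketch of a fractional-order gradient step with a memory step:
    the previous iterate x_{k-1} acts as the lower terminal, so the
    update is scaled by |x_k - x_{k-1}|**(1 - alpha) / Gamma(2 - alpha)."""
    x_prev = x0
    x = x0 - lr * grad(x0)  # ordinary first step to seed the memory term
    for _ in range(iters):
        step = abs(x - x_prev) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
        x_prev, x = x, x - lr * grad(x) * step
        if abs(x - x_prev) < eps:  # iterates have stopped moving
            break
    return x
```

On a convex test problem such as f(x) = (x − 3)², with gradient 2(x − 3), the iterates approach the true first-order extremum x = 3, illustrating why anchoring the lower terminal at the previous iterate (rather than a fixed point) is what allows convergence to the extremum.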
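The random-disturbance idea can be sketched as follows: run descent until the gradient vanishes, then kick the iterate with a bounded random perturbation and keep the best point found. For brevity the sketch uses a plain gradient step rather than the fractional one; the name `perturbed_descent` and all parameter values are illustrative assumptions.

```python
import random

def perturbed_descent(grad, f, x0, lr=0.01, noise=2.0,
                      restarts=50, iters=500, tol=1e-6):
    """Sketch of the random-disturbance strategy: descend until stalled
    at a (possibly local) extremum, then apply a random kick of bounded
    amplitude and retry, keeping the best point seen so far."""
    best_x, x = x0, x0
    for _ in range(restarts):
        for _ in range(iters):
            g = grad(x)
            if abs(g) < tol:          # stalled at a stationary point
                break
            x -= lr * g
        if f(x) < f(best_x):          # keep the best extremum found
            best_x = x
        x = best_x + random.uniform(-noise, noise)  # random disturbance
    return best_x
```

On a nonconvex function such as f(x) = (x² − 1)² + 0.3x, which has a local minimum near x ≈ 0.96 and a lower global minimum near x ≈ −1.04, starting in the wrong basin still yields the global basin once a disturbance carries the iterate across the barrier.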
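The two-stage idea can likewise be sketched in one dimension: a short particle swarm search screens candidate start points, and only the most promising one is refined by gradient descent, saving the work of converging from many poor initial points. The name `two_stage_minimize`, the PSO coefficients (inertia 0.7, acceleration 1.5), and the use of a plain gradient step in stage two are all illustrative assumptions rather than the thesis's exact design.

```python
import random

def two_stage_minimize(f, grad, bounds, n_particles=20, pso_iters=30,
                       lr=0.01, gd_iters=500):
    """Sketch of the two-stage method: stage 1 runs a coarse particle
    swarm search to screen initial points; stage 2 refines the best
    candidate with gradient descent."""
    lo, hi = bounds
    # Stage 1: coarse particle swarm search over [lo, hi]
    xs = [random.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                      # each particle's best position
    gbest = min(xs, key=f)             # swarm's best position
    for _ in range(pso_iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vs[i] = (0.7 * vs[i]
                     + 1.5 * r1 * (pbest[i] - xs[i])
                     + 1.5 * r2 * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)
    # Stage 2: gradient refinement from the screened start point
    x = gbest
    for _ in range(gd_iters):
        x -= lr * grad(x)
    return x
```

The design choice is the same as in the dissertation's description: the swarm supplies global exploration cheaply, while the gradient stage supplies fast local convergence, so neither stage has to do the other's job.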
Keywords/Search Tags: fractional calculus, gradient descent method, random disturbance, particle swarm optimization algorithm, global optimization, artificial neural network