
The Construction And Approximation Of Two Classes Of Neural Networks Operators

Posted on: 2021-09-09    Degree: Master    Type: Thesis
Country: China    Candidate: L P Chang    Full Text: PDF
GTID: 2480306308484874    Subject: Applied Mathematics
Abstract/Summary:
The construction and approximation of neural networks is one of the central and difficult problems in neural network theory and applications. In function approximation theory, operators are often constructed in ingenious ways so that they approximate the target function with a good rate of convergence. In this thesis, a parameter in (0, 1) is introduced into the construction of the operators; it plays an important role in balancing the orders of the approximation error. The thesis studies the construction and approximation properties of two classes of feed-forward neural network operators whose activation functions are obtained by translating sigmoidal functions. The modulus of continuity of the target function is used as the measure for estimating the approximation error. The contents of the thesis are arranged as follows:

1. A neural network operator based on the hyperbolic tangent function is constructed to approximate univariate functions. First, a bell-shaped function is built by suitably translating and combining the hyperbolic tangent function. Then the parameter in (0, 1) is introduced, and the constructed bell-shaped function is used as the activation function to define a class of neural network operators (a sketch of this construction appears after the abstract). Finally, a Jackson-type approximation theorem is established; analytical techniques and methods from function approximation theory are used to estimate the error of these operators in approximating continuous functions.

2. A multi-input neural network operator with the bell-shaped activation function obtained from translates of the hyperbolic tangent function is constructed, and its approximation of multivariate functions is studied. First, the parameter in (0, 1) is introduced to construct the multi-input neural network operators (see the second sketch below). Then, using the modulus of continuity of the multivariate function as the measure, the error of these operators in approximating multivariate continuous functions is estimated; the parameter plays an important role in balancing the error orders.

3. Construction and approximation of neural network operators with B-spline activation functions. First, a class of bell-shaped functions defined on R is constructed from a one-dimensional B-spline of order s. Then the parameter in (0, 1) is introduced, and the bell-shaped function is used as the activation function to construct a network operator that approximates continuous functions defined on a bounded closed interval (see the third sketch below). Finally, an approximation error estimate is established using the modulus of continuity and generalized absolute moments as measures.
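The following is a minimal numerical sketch of the kind of univariate operator described in item 1, assuming a commonly used form from this literature: a bell-shaped function built from two translates of tanh, nodes k/n on the interval, a normalized weighted sum, and a balancing parameter written here as `alpha` in (0, 1). The exact bell function, node set, normalization, and parameter symbol are assumptions for illustration, not taken from the thesis itself.

```python
import numpy as np

def bell_tanh(x):
    # Bell-shaped function from translates of the hyperbolic tangent
    # (assumed form): psi(x) = (tanh(x + 1) - tanh(x - 1)) / 4.
    return 0.25 * (np.tanh(x + 1.0) - np.tanh(x - 1.0))

def nn_operator_1d(f, n, x, alpha=0.5, a=0.0, b=1.0):
    """Quasi-interpolation neural network operator on [a, b].

    alpha in (0, 1) stands for the balancing parameter discussed in
    the abstract (symbol and operator form are assumptions).
    """
    k = np.arange(np.ceil(n * a), np.floor(n * b) + 1)   # nodes k/n in [a, b]
    nodes = k / n
    # Scaled activations psi(n^(1 - alpha) * (x - k/n)).
    w = bell_tanh(n ** (1.0 - alpha) * (x - nodes))
    return np.sum(f(nodes) * w) / np.sum(w)              # normalized weighted sum

# Usage: approximate f(x) = sin(2*pi*x) at x = 0.3 with n = 100.
f = lambda t: np.sin(2 * np.pi * t)
print(nn_operator_1d(f, n=100, x=0.3))
```

Larger n and smaller scaling exponent 1 - alpha concentrate the bell functions around x, which is the trade-off the balancing parameter controls.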
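For item 2, a standard way to pass to several inputs is a tensor-product (coordinate-wise product) of the univariate bell function over a grid of nodes in [0, 1]^d. The sketch below assumes that construction; the actual multivariate operator in the thesis may differ.

```python
import numpy as np

def bell_tanh(x):
    # Same univariate bell function as in the previous sketch (assumed form).
    return 0.25 * (np.tanh(x + 1.0) - np.tanh(x - 1.0))

def nn_operator_nd(f, n, x, alpha=0.5):
    """Multi-input operator on [0, 1]^d; alpha in (0, 1) is the
    balancing parameter (symbol and operator form are assumptions)."""
    x = np.asarray(x, dtype=float)
    d = x.size
    # Tensor-product grid of nodes k/n in [0, 1]^d.
    grids = np.meshgrid(*([np.arange(n + 1) / n] * d), indexing="ij")
    nodes = np.stack(grids, axis=-1).reshape(-1, d)
    # Multivariate bell function as a product of univariate ones.
    w = np.prod(bell_tanh(n ** (1.0 - alpha) * (x - nodes)), axis=-1)
    vals = np.apply_along_axis(f, 1, nodes)
    return np.sum(vals * w) / np.sum(w)

# Usage: approximate f(x, y) = x * y at (0.4, 0.7) with n = 60.
print(nn_operator_nd(lambda p: p[0] * p[1], n=60, x=[0.4, 0.7]))
```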
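For item 3, the sketch below uses the centered cardinal B-spline of order s as the bell-shaped activation, computed from the standard truncated-power formula, and plugs it into the same normalized operator form. The choice of centering, node set, and the parameter name `alpha` are assumptions; the thesis may use a different bell function built from the B-spline.

```python
from math import comb, factorial
import numpy as np

def bspline(x, s=3):
    # Centered cardinal B-spline of order s (degree s - 1, s >= 2),
    # supported on [-s/2, s/2]; bell-shaped on R.
    t = np.asarray(x, dtype=float) + s / 2.0
    out = np.zeros_like(t)
    for j in range(s + 1):
        out += (-1) ** j * comb(s, j) * np.clip(t - j, 0.0, None) ** (s - 1)
    return out / factorial(s - 1)

def nn_operator_bspline(f, n, x, s=3, alpha=0.5, a=0.0, b=1.0):
    # Same quasi-interpolation form as the tanh-based sketch, with the
    # centered B-spline as activation; alpha in (0, 1) balances the error orders.
    k = np.arange(np.ceil(n * a), np.floor(n * b) + 1)
    nodes = k / n
    w = bspline(n ** (1.0 - alpha) * (x - nodes), s=s)
    return np.sum(f(nodes) * w) / np.sum(w)

# Usage: approximate f(x) = |x - 0.5| at x = 0.25 with n = 200.
print(nn_operator_bspline(lambda t: np.abs(t - 0.5), n=200, x=0.25))
```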
Keywords/Search Tags:Neural network, Approximation, Error estimation, Hyperbolic tangent function, B-spline function