
The Influence Of The Structure Of Neural Networks On Function Approximation And Numerical Solutions Of Differential Equations

Posted on: 2021-03-22
Degree: Master
Type: Thesis
Country: China
Candidate: X Z Lv
Full Text: PDF
GTID: 2480306107959419
Subject: Computational Mathematics
Abstract
Deep learning based on neural networks has achieved remarkable results in many different fields, and one important reason is that a neural network is a universal approximator. The universal approximation theorem lays the theoretical foundation for the expressive power of neural networks; however, how to choose an appropriate activation function and how to decide the depth and width of a network remain subjects of active study. This thesis introduces the concept of neural networks and algorithms for solving differential equations based on them, and mainly studies how the activation function, the depth and width of a BP neural network, and the optimization algorithm influence function approximation and the numerical solution of differential equations.

The thesis first discusses the influence of the ReLU, sine, and sigmoid activation functions on the expressive power of a neural network, and proves approximation results for networks built on each of these activation functions. Numerical examples are computed with the TensorFlow deep learning framework. The results show that a ReLU network fits poorly because its approximation is piecewise linear and a large number of "dead neurons" occur; in deep networks, the sigmoid network suffers from a severe vanishing-gradient phenomenon, while the sine network alleviates vanishing gradients and yields better numerical results (a comparison sketch is given below). When solving partial differential equations, the accuracy of the numerical solution improves greatly if the network output is adjusted so that the network automatically satisfies the initial and boundary conditions (see the trial-solution sketch below).

To illustrate the influence of depth and width, the thesis proves that a class of radial indicator functions $\mathbb{1}(\lVert Ax+b \rVert < r)$, $x \in \mathbb{R}^n$, can be expressed by a neural network with two hidden layers but cannot be expressed by a single-hidden-layer network whose width is less than $O(e^{cn})$ (where $c$ is a constant). Numerical experiments also show that increasing depth benefits neural network learning more than increasing width. Finally, on the basis of the Adam optimization method, an "early stop" strategy based on an exponential moving average of the error is proposed; it makes the loss function decrease rapidly and avoids the oscillation caused by overly long training (a sketch of the rule follows below).
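As a concrete illustration of the activation-function comparison above, the following minimal sketch fits a one-dimensional target with ReLU, sine, and sigmoid networks in TensorFlow. The target function, network sizes, and training settings here are illustrative assumptions, not the thesis's actual experimental setup.

```python
import numpy as np
import tensorflow as tf

# Illustrative 1-D target function (an assumption, not the thesis's test case).
x = np.linspace(-np.pi, np.pi, 256, dtype=np.float32).reshape(-1, 1)
y = np.sin(2 * x) + 0.5 * x

def make_net(activation):
    # A small fully connected (BP) network with a configurable activation.
    return tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation=activation, input_shape=(1,)),
        tf.keras.layers.Dense(32, activation=activation),
        tf.keras.layers.Dense(1),
    ])

# Compare ReLU, sine, and sigmoid under otherwise identical settings.
for act in ["relu", tf.sin, "sigmoid"]:
    model = make_net(act)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")
    hist = model.fit(x, y, epochs=2000, verbose=0)
    print(act, "final MSE:", hist.history["loss"][-1])
```

Running such a comparison makes the qualitative behaviors described above easy to reproduce: the piecewise-linear fit of ReLU versus the smoother fits of the sine and sigmoid networks.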
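The "adjust the output" idea for initial and boundary conditions can be sketched as follows: instead of penalizing the conditions in the loss, build them into a trial solution so the network satisfies them by construction. The example problem $u'(x) = -u(x)$, $u(0)=1$ with trial solution $u(x) = 1 + x\,N(x)$, and all hyperparameters, are assumptions for illustration, not the thesis's test problems.

```python
import tensorflow as tf

# Network N(x); the trial solution u(x) = 1 + x * N(x) satisfies u(0) = 1
# automatically, so only the ODE residual needs to be minimized.
net = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation=tf.sin, input_shape=(1,)),
    tf.keras.layers.Dense(32, activation=tf.sin),
    tf.keras.layers.Dense(1),
])
opt = tf.keras.optimizers.Adam(1e-3)
x = tf.reshape(tf.linspace(0.0, 1.0, 128), (-1, 1))

@tf.function
def train_step():
    with tf.GradientTape() as loss_tape:
        with tf.GradientTape() as u_tape:
            u_tape.watch(x)
            u = 1.0 + x * net(x)                 # initial condition built in
        du_dx = u_tape.gradient(u, x)            # u'(x) by autodiff
        loss = tf.reduce_mean((du_dx + u) ** 2)  # residual of u' = -u
    grads = loss_tape.gradient(loss, net.trainable_variables)
    opt.apply_gradients(zip(grads, net.trainable_variables))
    return loss

for step in range(5000):
    loss = train_step()
u1 = 1.0 + 1.0 * float(net(tf.constant([[1.0]]))[0, 0])
# Exact solution is exp(-x), so u(1) should approach exp(-1) ~ 0.3679.
print("residual loss:", float(loss), "u(1):", u1)
```

Because the condition holds exactly for every parameter setting, the optimizer never trades accuracy in the interior against accuracy on the boundary, which is the mechanism behind the accuracy gain reported above.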
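The depth-width separation claim can be restated compactly. The notation below is reconstructed from the abstract; the precise hypotheses on $A$, $b$, $r$, and the norm are in the thesis body.

```latex
% Depth--width separation, restated from the abstract (hypotheses on A, b, r
% and the choice of norm are reconstructed assumptions; see the thesis body).
\[
  f(x) \;=\; \mathbb{1}\bigl(\lVert Ax + b \rVert < r\bigr),
  \qquad x \in \mathbb{R}^{n}.
\]
% Claim: f is expressible by a network with two hidden layers, but no
% single-hidden-layer network of width below O(e^{cn}) expresses f,
% where c > 0 is a constant.
```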
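Finally, the "early stop" rule can be sketched framework-independently: maintain an exponential moving average (EMA) of the per-step loss and stop once the smoothed loss has stopped improving. The decay rate, tolerance, and patience below are assumed values chosen for illustration.

```python
# Minimal sketch of an early-stop rule driven by an EMA of the loss.
def train_with_ema_early_stop(step_fn, max_steps=20000, decay=0.99,
                              tol=1e-6, patience=500):
    """step_fn performs one Adam update and returns the step's loss."""
    ema, best, wait = None, float("inf"), 0
    for step in range(max_steps):
        loss = float(step_fn())
        # Smooth the raw loss so single noisy steps do not trigger a stop.
        ema = loss if ema is None else decay * ema + (1 - decay) * loss
        if ema < best - tol:      # smoothed loss is still improving
            best, wait = ema, 0
        else:                     # no improvement in the EMA
            wait += 1
            if wait >= patience:  # stop before oscillation sets in
                break
    return step, ema
```

Smoothing before the stopping test is what distinguishes this rule from naive early stopping on the raw loss, whose step-to-step oscillation would otherwise trigger premature stops.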
Keywords/Search Tags:Neural Network, Activation function, Depth and Width, Optimization algorithms, Function approximation, Numerical solutions of differential equations