Study On Different Types Of Noise Benefit Of Convolutional Neural Networks Under ReLU Activation Function

Posted on: 2024-02-08
Degree: Master
Type: Thesis
Country: China
Candidate: Y Liu
Full Text: PDF
GTID: 2558307136497444
Subject: Applied statistics
Abstract/Summary:
With the rapid development of computer technology and artificial neural network theory, convolutional neural networks have been widely used in image classification, object detection, and other fields. As a representative artificial neural network model, the convolutional neural network is a nonlinear system, and stochastic resonance theory shows that adding appropriate noise to a nonlinear system can improve its performance, so stochastic-resonance-assisted optimization of convolutional neural network performance is a problem worthy of in-depth study. Current stochastic resonance research on convolutional neural networks focuses mainly on traditional convolutional network models. This thesis compares the influence of different activation functions on convolutional neural networks and studies the stochastic resonance phenomenon in convolutional neural networks under different types of noise. The main research contents of the thesis are as follows:

(1) The effects of different activation functions on convolutional neural network performance are compared. The properties of the Sigmoid, Tanh, and ReLU activation functions and their influence on convolutional neural network training are analyzed, and simulations are then carried out on two experimental datasets, MNIST and Fashion-MNIST. The experimental results show that, compared with the two traditional activation functions, the model using the ReLU function performs better in both cross-entropy and classification accuracy, and also has advantages in generalization performance and average training time, demonstrating that the choice of activation function affects the behavior of the network (a minimal sketch of this comparison is given after the abstract).

(2) An improved noise-enhanced convolutional neural network (NCNN) algorithm is proposed. With ReLU as the activation function, the noise benefit of the NCNN algorithm is obtained by adding uniform noise to the output layer and updating the model parameters with the minibatch stochastic gradient descent algorithm (see the training-step sketch after the abstract). Simulations on the MNIST and Fashion-MNIST datasets show that, under insufficient training, the cross-entropy and classification error rate of the proposed model are lower than those of the noise-free model, i.e., a noise benefit exists.

(3) The uniform noise in the algorithm is extended to Gaussian noise, Laplace noise, and Cauchy noise, and the influence of the noise type on the noise benefit of convolutional neural networks is studied (see the noise-sampler sketch after the abstract). The experimental results show that, under insufficient training, the NCNN models with all three non-uniform noise types exhibit a noise benefit, and that the noise benefit of Cauchy noise is lower than that of the other three noise types at the same initial noise intensity.
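The activation-function comparison in (1) can be illustrated with a minimal PyTorch sketch. The architecture, layer sizes, and any training settings below are illustrative assumptions, not the model actually used in the thesis; only the idea of swapping Sigmoid, Tanh, and ReLU into the same convolutional network comes from the abstract.

    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        # Toy CNN for 28x28 grayscale images (MNIST / Fashion-MNIST);
        # the activation module is passed in, so the same architecture
        # can be trained with Sigmoid, Tanh, or ReLU.
        def __init__(self, act: nn.Module):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), act,
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), act,
                nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 7 * 7, 10)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    # One model per activation; each would be trained and then compared
    # on cross-entropy and classification accuracy.
    models = {name: SmallCNN(act) for name, act in
              [("Sigmoid", nn.Sigmoid()), ("Tanh", nn.Tanh()), ("ReLU", nn.ReLU())]}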
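The NCNN training step in (2), with uniform noise injected at the output layer and parameters updated by minibatch stochastic gradient descent, might look like the following sketch. The noise range, learning rate, and the exact injection point (logits before the cross-entropy loss) are assumptions; the precise injection rule used in the thesis may differ.

    import torch
    import torch.nn.functional as F

    def ncnn_train_step(model, optimizer, x, y, noise_level):
        # Forward pass, then add zero-mean uniform noise in
        # [-noise_level, noise_level] to the output layer.
        logits = model(x)
        noise = (torch.rand_like(logits) * 2 - 1) * noise_level
        loss = F.cross_entropy(logits + noise, y)
        # Minibatch stochastic gradient descent update.
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    # Usage (model as in the previous sketch):
    # model = SmallCNN(torch.nn.ReLU())
    # opt = torch.optim.SGD(model.parameters(), lr=0.01)
    # for x, y in train_loader:  # minibatches
    #     ncnn_train_step(model, opt, x, y, noise_level=0.1)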
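Extending the algorithm from uniform to Gaussian, Laplace, and Cauchy noise, as in (3), only changes the sampler. How the single "initial noise intensity" maps onto each distribution's scale is not specified in the abstract, so using it directly as the scale parameter below is an assumption.

    import torch

    def sample_noise(kind, shape, intensity):
        # Zero-centered noise of the requested type; `intensity` is used
        # as the half-width (uniform) or the scale parameter (others).
        if kind == "uniform":
            return (torch.rand(shape) * 2 - 1) * intensity
        if kind == "gaussian":
            return torch.randn(shape) * intensity
        if kind == "laplace":
            return torch.distributions.Laplace(0.0, intensity).sample(shape)
        if kind == "cauchy":
            return torch.distributions.Cauchy(0.0, intensity).sample(shape)
        raise ValueError(f"unknown noise type: {kind}")

    # Replacing the uniform sampler in ncnn_train_step with sample_noise
    # lets the four NCNN variants be compared at the same intensity.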
Keywords/Search Tags:Convolutional neural networks, Activation function, Cross entropy, Image classification, Stochastic resonance