| Stochastic resonance refers to the phenomenon in which, under certain conditions, noise is beneficial to signal or information processing in nonlinear systems. This paper studies convolutional neural networks based on common optimization algorithms and the stochastic resonance phenomenon in such networks. The specific research contents are as follows. First, five optimization algorithms for gradient descent in the back-propagation algorithm are studied from four aspects: time consumption, initial learning rate, batch size, and number of convolution kernels. The five optimization algorithms differ significantly in the time the convolutional neural network needs to reach convergence, and the Adam optimization algorithm lets the network converge in the shortest time. Different initial learning rates, batch sizes, and numbers of convolution kernels also influence the training process of the convolutional neural network to different degrees. Then a three-layer convolutional neural network is constructed, in which the back-propagation algorithm uses the momentum gradient descent algorithm to update the network parameters and the Adam algorithm optimizes the momentum gradient descent. Under limited conditions, the Adam optimization algorithm improves the performance of the convolutional neural network, and increasing the number of training samples narrows the gap in cross entropy and error rate between the two algorithms. Gaussian noise is then added to the output neurons of the Adam-optimized convolutional neural network; the simulation results show that the reduction percentage of the cross entropy and the reduction percentage of the error rate first decrease and then increase as the noise intensity grows, i.e., stochastic resonance occurs, although increasing the number of training samples weakens the stochastic resonance effect. Finally, uniform noise is added to the output neurons of the convolutional neural network. The experimental results show that stochastic resonance again appears in both measures, the reduction percentage of the cross entropy and the reduction percentage of the error rate, and that increasing the number of training samples likewise weakens the stochastic resonance effect. |
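
As a rough illustration of the measurement procedure summarized above, the following Python sketch adds zero-mean Gaussian noise to the output-layer activations of a stand-in classifier and records how the cross entropy and error rate, expressed as reduction percentages relative to the noise-free outputs, change with the noise intensity. The random logits, labels, noise levels, and the baseline used for the reduction percentage are assumptions made for illustration only; they are not the paper's actual network, data, or evaluation protocol.

```python
import numpy as np

# Minimal sketch (assumed, not the paper's code): inject Gaussian noise into
# output-layer activations and measure cross entropy / error rate versus
# the noise standard deviation sigma.

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    n = labels.shape[0]
    return -np.log(probs[np.arange(n), labels] + 1e-12).mean()

def error_rate(probs, labels):
    return (probs.argmax(axis=1) != labels).mean()

def reduction_percent(baseline, value):
    # Positive when the noisy value is lower (better) than the baseline.
    return 100.0 * (baseline - value) / baseline

rng = np.random.default_rng(0)
logits = rng.normal(size=(1000, 10))        # stand-in for output-layer activations
labels = rng.integers(0, 10, size=1000)     # stand-in for test-set labels

ce_clean = cross_entropy(softmax(logits), labels)
er_clean = error_rate(softmax(logits), labels)

for sigma in [0.0, 0.1, 0.3, 0.5, 1.0]:
    noisy = logits + rng.normal(0.0, sigma, size=logits.shape)
    p = softmax(noisy)
    print(f"sigma={sigma:.1f}  "
          f"CE reduction %: {reduction_percent(ce_clean, cross_entropy(p, labels)):+.2f}  "
          f"error reduction %: {reduction_percent(er_clean, error_rate(p, labels)):+.2f}")
```

With a trained network in place of the random logits, sweeping the noise intensity in this way and plotting the two reduction percentages is one way to visualize the non-monotonic, stochastic-resonance-like behaviour described in the abstract; the uniform-noise experiment would replace the Gaussian draw with a uniform one of matching spread.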