
Research On Pruning Algorithms For Neural Network Image Classification Tasks

Posted on: 2024-09-30
Degree: Master
Type: Thesis
Country: China
Candidate: Z B Zhu
Full Text: PDF
GTID: 2568306932455614
Subject: Cyberspace security
Abstract/Summary:
The rapid development of Convolutional Neural Networks (CNNs) has driven significant progress in machine vision. In recent years, as CNNs have grown in size and complexity, many researchers have turned to reducing model size and complexity while maintaining high performance. Model parameter pruning addresses this problem well. Model pruning is a machine-learning technique whose primary purpose is to shrink a model without degrading its performance, sparsifying it and enhancing its robustness. Many pruning works use magnitude-based information to introduce sparsity into network models and obtain more efficient, energy-saving models, and model compression has attracted increasing attention across the full range of artificial intelligence tasks.

Existing pruning methods generally use regularization to guide parameter pruning and rigidly remove parameters that fall below a threshold. This optimization changes the distribution of the parameters, yet the parameters must satisfy a specific distribution for the compressed network to perform well. Moreover, when pruning a model, only single-layer information is used to decide whether a parameter is removed; the lack of assistance from the layers above and below makes such pruning methods incomplete. To make pruning better grounded and to connect the information of each model layer during pruning, this dissertation proposes two pruning methods, BNP and GAP.

Addressing the problems that model parameters must satisfy a specific distribution and that pruning algorithms consider only single-layer information, BNP combines the filter that generates a channel with the channel's own information during pruning. No constraint on the distribution of network parameters is needed: pruning importance is guided by fusing information from the CB (Convolution and BatchNorm) module, and the batch normalization (BatchNorm, BN) layer is used to analyze the distribution similarity between features, achieving better structured pruning. Compared with traditional regularization-based methods, this algorithm performs better. In comprehensive evaluations on modern CNNs with the CIFAR-10 and ImageNet datasets, BNP achieves excellent model accuracy on image classification tasks while guaranteeing the pruning ratio.

Addressing the general reliance on regularization in pruning methods and the need to consider the network structure as a whole rather than pruning on single-layer information alone, GAP introduces a new attention mechanism. It judges each channel's contribution to network performance through a channel score generated by the attention mechanism and selects important channels across the global network structure. Channels are scored according to gradient magnitude, parameter magnitude, and the results of the previous pruning round; channels with lower scores are cut, and the set of channels retained by the pruning algorithm is dynamically adjusted according to training feedback. Letting the layers interact with each other during pruning allows the pruned model to achieve better accuracy. In experiments, GAP balances performance and efficiency better than conventional methods.
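The CB-module idea described for BNP can be illustrated with a small sketch. This is a hypothetical reconstruction, not the dissertation's actual implementation: the function names (`cb_channel_scores`, `prune_mask`), the choice of L1 filter norm, and the multiplicative fusion of filter magnitude with the BN scale factor are all illustrative assumptions about how Conv and BatchNorm information might be fused into one per-channel importance score.

```python
import numpy as np

def cb_channel_scores(conv_weights, bn_gammas):
    """Fuse Conv and BatchNorm information (a CB-module-style score).

    conv_weights: array of shape (out_channels, in_channels, k, k)
    bn_gammas:    per-channel BN scale factors, shape (out_channels,)
    Hypothetical fusion: L1 norm of each filter weighted by the
    magnitude of its BN scale factor.
    """
    filter_norms = np.abs(conv_weights).reshape(conv_weights.shape[0], -1).sum(axis=1)
    return filter_norms * np.abs(bn_gammas)

def prune_mask(scores, prune_ratio):
    """Keep the top (1 - prune_ratio) fraction of channels by score."""
    n_keep = int(len(scores) * (1.0 - prune_ratio))
    keep = np.argsort(scores)[::-1][:n_keep]
    mask = np.zeros(len(scores), dtype=bool)
    mask[keep] = True
    return mask  # True = channel retained, False = channel pruned
```

Because the score already folds in the BN scale, no separate regularizer is needed to push parameters toward a particular distribution before thresholding, which matches the abstract's claim that BNP avoids constraining the parameter distribution.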
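GAP's dynamic scoring, as described above, combines gradient magnitude, parameter magnitude, and feedback from the previous pruning round. The sketch below is an assumed formulation: the names (`gap_scores`, `select_channels`), the product of gradient and parameter norms, and the momentum-style smoothing against last round's scores are illustrative guesses at the mechanism, not the thesis's actual attention network.

```python
import numpy as np

def gap_scores(grad_norms, param_norms, prev_scores, momentum=0.5):
    """Score channels from gradient size, parameter size, and the
    previous round's scores (training feedback), hypothetically fused
    as an exponentially smoothed product."""
    raw = grad_norms * param_norms
    return momentum * prev_scores + (1.0 - momentum) * raw

def select_channels(scores, n_prune):
    """Cut the n_prune lowest-scoring channels; because scores are
    re-estimated each round, the retained set can change dynamically."""
    order = np.argsort(scores)
    pruned = set(order[:n_prune].tolist())
    return [i for i in range(len(scores)) if i not in pruned]
```

Re-running `gap_scores` and `select_channels` after each training round lets a channel pruned early recover later if its gradient-based score rises, which is one plausible reading of "dynamically adjusted according to the training feedback."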
Keywords/Search Tags:Convolutional neural network, Structured pruning, Model compression, Information fusion