
Research On Deep Neural Network Pruning Method Based On Mask Learning

Posted on: 2023-08-22
Degree: Master
Type: Thesis
Country: China
Candidate: L Lu
Full Text: PDF
GTID: 2568306758965569
Subject: Electronic information
Abstract/Summary:
Traditional deep neural network pruning algorithms are usually used to eliminate redundant structures in a network in order to make it lighter. However, some recent studies have found that an over-parameterized, randomly initialized neural network already contains sparse sub-structures that perform well on their own, and several pruning algorithms have been designed to find the sparse structures hidden inside such over-parameterized networks. These studies show that pruning can also serve as a way to discover sparse structures in over-parameterized neural networks. This thesis focuses on mask-learning-based pruning algorithms that find sparse sub-networks in over-parameterized neural networks by learning masks. Two specific contributions are made:

1) A pruning method for randomly initialized neural networks based on sparse binary programming. Inspired by the strong performance of recent mask-learning pruning algorithms such as Edge-popup, this method models the pruning process as a sparse binary constrained optimization problem. The core idea is to use sparse binary programming to learn a binary mask, with which an untrained but well-performing sparse sub-network can be extracted from a randomly initialized neural network. Compared with related algorithms such as Edge-popup, the sparse networks found by this method achieve better classification generalization at multiple sparsity levels.

2) A mask-learning pruning method based on few-shot meta-learning. Building on contribution 1), this method combines mask-learning pruning with meta-learning so that the pruned sparse network retains good generalization accuracy even on few-shot classification tasks. Whereas contribution 1) learns the mask by sparse binary programming over randomly initialized importance scores, this method treats the importance scores as meta-parameters and optimizes them with the Reptile algorithm. Based on the learned meta-parameter importance scores, it then performs sparse binary programming and learns binary masks to complete the pruning. The idea of transfer learning is also borrowed: weights trained on another dataset are used to initialize the network. Experimental results show that this algorithm can still prune sparse networks with good performance from a large network when facing classification tasks with few samples.
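The "sparse binary constrained optimization problem" mentioned above can be written in a generic form (the symbols here are illustrative, not necessarily the thesis's exact notation):

```latex
\min_{m} \; \mathcal{L}\bigl(f(x;\, W \odot m)\bigr)
\quad \text{s.t.} \quad m \in \{0,1\}^{n},\; \|m\|_{0} \le k,
```

where $W$ denotes the frozen, randomly initialized weights, $m$ the binary mask being learned, $\odot$ element-wise multiplication, and $k$ the number of weights kept at the target sparsity.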
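As a rough illustration of how mask-learning pruning of the Edge-popup family selects a sub-network, the sketch below keeps the weights whose importance scores fall in the top fraction determined by the sparsity level. This is a minimal NumPy sketch, not the thesis's implementation; the function name and shapes are made up for the example, and in a real algorithm the scores would be updated by training rather than left random.

```python
import numpy as np

def topk_mask(scores, sparsity):
    """Binary mask keeping the top-(1 - sparsity) fraction of importance scores."""
    k = int(round((1.0 - sparsity) * scores.size))
    flat = scores.ravel()
    keep = np.argpartition(flat, -k)[-k:]   # indices of the k largest scores
    mask = np.zeros_like(flat)
    mask[keep] = 1.0
    return mask.reshape(scores.shape)

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 32))   # frozen, randomly initialized weights
s = rng.standard_normal((64, 32))   # learnable importance scores (random here)
m = topk_mask(s, sparsity=0.9)      # keep 10% of the weights
W_sparse = W * m                    # untrained sparse sub-network
```

During training, algorithms in this family typically hold `W` fixed and update only `s`, recomputing the mask each forward pass and passing gradients to the scores with a straight-through estimator.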
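The Reptile optimization of the importance scores described in contribution 2) follows the standard Reptile pattern: adapt a copy of the meta-parameters to each task with a few inner gradient steps, then move the meta-parameters toward the average of the adapted copies. The sketch below shows that pattern on a toy quadratic objective; the function names, losses, and hyperparameters are illustrative, not the thesis's setup.

```python
import numpy as np

def reptile_step(theta, tasks, grad_fn, inner_steps=5, inner_lr=0.1, meta_lr=0.5):
    """One Reptile meta-update: run inner SGD on a copy of theta for each task,
    then move theta toward the mean of the adapted parameters."""
    adapted = []
    for task in tasks:
        phi = theta.copy()
        for _ in range(inner_steps):
            phi = phi - inner_lr * grad_fn(phi, task)
        adapted.append(phi)
    return theta + meta_lr * (np.mean(adapted, axis=0) - theta)

# Toy demonstration: each "task" is a target vector t with loss 0.5*||phi - t||^2,
# so the gradient is (phi - t).  Here phi stands in for the importance scores.
grad = lambda phi, t: phi - t
tasks = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
theta = np.zeros(2)
for _ in range(20):
    theta = reptile_step(theta, tasks, grad)
# theta approaches [0.5, 0.5], a point that adapts quickly to both tasks
```

In the thesis's setting, `theta` would be the importance scores, each task a few-shot classification episode, and the final scores would feed into the sparse binary programming step that produces the binary mask.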
Keywords/Search Tags: Neural network pruning, Meta-learning, Binary mask, Mask learning