
Research On Image Super-Resolution Network Based On Attention Mechanism

Posted on: 2023-10-21
Degree: Master
Type: Thesis
Country: China
Candidate: Q Cai
Full Text: PDF
GTID: 2558307061961909
Subject: Electronic and communication engineering

Abstract/Summary:
In recent years, deep learning methods have been successfully applied to the single-image super-resolution task, and deep convolutional neural networks can significantly improve single-image super-resolution performance. However, most models based on deep convolutional neural networks lack the ability to distinguish between different types of information and treat them all equally, which limits the representational ability of the model. The attention mechanism addresses exactly this defect of treating information from different layers, different channels, and different spatial locations uniformly. Therefore, this thesis builds super-resolution models around the attention mechanism and achieves notable results on the test datasets when model performance and parameter count are measured together. The specific work of this thesis is as follows:

Firstly, the background and significance of super-resolution research are introduced. By tracing the development and innovations of super-resolution technology in recent years, the landmark models are reviewed, including residual networks, densely connected networks, attention-based networks, generative adversarial networks, and so on.

Secondly, the theoretical basis of super-resolution is discussed in detail: the theoretical model of super-resolution is defined by explicit mathematical formulas (the standard formulation is recalled below), and four mainstream super-resolution frameworks are introduced. The structural design of classical convolutional neural networks is described, and loss functions and image quality metrics are summarized. Several classical super-resolution convolutional neural networks are presented, and the advantages and disadvantages of each network model are analyzed from the perspective of the overall network structure. Several basic attention mechanisms are then studied, their mathematical principles are written out in detail, and their internal operations are shown with block diagrams. Finally, the datasets commonly used in super-resolution and their notable characteristics are listed.

Then a multi-scale attention block is proposed as an improvement on the ordinary residual block. Since multi-scale convolution can extract features at different scales, convolution kernels of two sizes, 3x3 and 5x5, are introduced into the residual block to adaptively detect image features at different scales. In addition, by passing these features through two parallel attention branches (a channel attention module and a spatial attention module), the relationships between different channels and between different positions within each channel are explored, focusing the network's attention on the channels and spatial positions that carry the more important feature information (a code sketch is given below). Based on the proposed multi-scale attention block, a cascaded multi-scale attention residual network is constructed using a simple global and local cascading connection structure. The cascading connections let the low-frequency information of earlier multi-scale attention blocks/groups flow to every subsequent block/group, so that the network learns multi-level feature representations and reconstructs the texture details and contextual content of the image at the same time. The model also adopts a global residual learning structure, which both mitigates vanishing gradients during training and lets the network learn the residual information through a simple skip connection (see the second code sketch below).
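For reference, the "theoretical model of super-resolution" mentioned above is, in its standard form (a sketch only; the exact notation used in the thesis is not reproduced in this abstract):

    y = (x \otimes k)\downarrow_{s} + n, \qquad \hat{x} = F_{\theta}(y)

where y is the observed low-resolution image, x the latent high-resolution image, k a blur kernel, \otimes convolution, \downarrow_{s} downsampling by scale factor s, and n additive noise; the super-resolution network F_{\theta} is trained to approximate the inverse mapping.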
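The multi-scale attention block described above can be sketched in PyTorch as follows. This is a minimal illustration, not the thesis's exact implementation: the abstract specifies only the parallel 3x3/5x5 convolution branches and the parallel channel and spatial attention branches inside a residual block, so the module names, the squeeze-and-excitation form of the channel attention, the pooled-map form of the spatial attention, the reduction ratio, and the additive fusion of the two attention branches are all assumptions.

import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    # Squeeze-and-excitation style channel attention (assumed form).
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))  # reweight channels

class SpatialAttention(nn.Module):
    # Spatial attention over channel-pooled maps (assumed form).
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = torch.mean(x, dim=1, keepdim=True)
        mx, _ = torch.max(x, dim=1, keepdim=True)
        mask = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask  # reweight spatial positions

class MultiScaleAttentionBlock(nn.Module):
    # Residual block with parallel 3x3 / 5x5 branches followed by
    # parallel channel / spatial attention, as described in the abstract.
    def __init__(self, channels):
        super().__init__()
        self.conv3 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv5 = nn.Conv2d(channels, channels, 5, padding=2)
        self.fuse = nn.Conv2d(2 * channels, channels, 1)  # merge the two scales
        self.act = nn.ReLU(inplace=True)
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        multi = torch.cat([self.conv3(x), self.conv5(x)], dim=1)
        feat = self.act(self.fuse(multi))
        feat = self.ca(feat) + self.sa(feat)  # two parallel attention branches
        return x + feat  # local residual connection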
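Continuing the sketch, one plausible wiring of the cascading connections and the global residual path, reusing the MultiScaleAttentionBlock above. The 1x1 fusion convolutions that compress the concatenated earlier features are a common way to realize cascading connections, and the channel count, block count, and pixel-shuffle upsampler are illustrative assumptions:

class CascadedMSARN(nn.Module):
    # Cascaded multi-scale attention residual network (sketch): each block's
    # output is concatenated with all earlier features and compressed by a
    # 1x1 conv, so low-frequency information flows to every later block.
    def __init__(self, channels=64, n_blocks=4, scale=2):
        super().__init__()
        self.head = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList(
            [MultiScaleAttentionBlock(channels) for _ in range(n_blocks)])
        self.fuse = nn.ModuleList(
            [nn.Conv2d((i + 2) * channels, channels, 1) for i in range(n_blocks)])
        self.tail = nn.Sequential(
            nn.Conv2d(channels, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, x):
        shallow = self.head(x)
        feats, cur = [shallow], shallow
        for block, fuse in zip(self.blocks, self.fuse):
            feats.append(block(cur))
            cur = fuse(torch.cat(feats, dim=1))  # cascading connection
        return self.tail(cur + shallow)  # global residual learning

# e.g. CascadedMSARN()(torch.randn(1, 3, 48, 48)) -> tensor of shape (1, 3, 96, 96)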
Finally, a multi-scale global attention residual network is proposed. Since the previous network treats feature information from different layers equally, a layer attention mechanism is introduced to adaptively weight the network's hierarchical features and account for the correlations between different layers (a sketch of such a module is given below). The network design also adopts an improved residual-in-residual structure: the inner residual consists of stacked multi-scale attention blocks, with dense connections to make full use of the features of every attention block and short skip connections that let the network learn residual information; the outer residual is formed by attention groups composed of multi-scale attention blocks, with layer attention applied across the groups to exploit the feature relationships between different attention groups, so that the network can focus on the groups that contribute more. A long skip connection is also used to learn residual information and to accelerate the convergence of model training. Experimental results show that both the proposed cascaded multi-scale attention residual network and the multi-scale global attention residual network achieve notable results when model performance and parameter count are measured together.
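A layer attention module of the kind described can be sketched as below. The thesis's exact formulation is not given in this abstract, so this follows the common pattern of computing a learned N x N correlation matrix over the stacked outputs of N attention groups and re-weighting them through a zero-initialized residual scale; all names are illustrative.

import torch
import torch.nn as nn

class LayerAttention(nn.Module):
    # Attention across the outputs of N attention groups: correlated groups
    # reinforce each other, letting the network emphasise the groups that
    # contribute most (assumed formulation).
    def __init__(self):
        super().__init__()
        self.scale = nn.Parameter(torch.zeros(1))  # starts as identity mapping

    def forward(self, group_feats):
        # group_feats: (B, N, C, H, W), e.g. torch.stack(group_outputs, dim=1)
        b, n, c, h, w = group_feats.shape
        flat = group_feats.view(b, n, -1)                          # (B, N, C*H*W)
        attn = torch.softmax(flat @ flat.transpose(1, 2), dim=-1)  # (B, N, N)
        out = (attn @ flat).view(b, n, c, h, w)
        return self.scale * out + group_feats  # residual re-weighting

In a full model, the re-weighted group features would then be fused (for example by reshaping to (B, N*C, H, W) and applying a convolution) and added back through the long skip connection before upsampling.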
Keywords/Search Tags: Multi-scale, Channel Attention, Spatial Attention, Layer Attention, Cascade, Residual in Residual