As a carrier of recorded information, images play an indispensable role in daily life. Image resolution indicates how much detail an image contains: it determines the sharpness of the image and the fineness of its details. Obtaining high-resolution images has therefore become an important research direction in image processing. However, limitations of image-acquisition hardware and other external factors interfere with capturing high-resolution images directly. To improve resolution while avoiding the cost of hardware upgrades, software methods can be used instead, which is one of the conditions that led to the emergence of Super-Resolution (SR). SR improves image quality purely in software, and its cost-effectiveness makes it stand out in the field of image processing.
The main work of this paper is as follows:
1) The background and significance of super-resolution technology are systematically explained, and the subjective and objective criteria for evaluating image quality are introduced in detail. Several typical algorithms that have emerged over the development of super-resolution are briefly reviewed, together with the basic principle, advantages, and disadvantages of each.
2) The basic principles, network components, and training process of convolutional neural networks are introduced in detail, followed by the SRCNN algorithm and improved algorithms based on SRCNN.
3) A super-resolution reconstruction algorithm based on a residual network is proposed. It improves on the traditional SRCNN network model by adding residual connections to the three-layer SRCNN, which increases the depth of the network and yields a better reconstruction effect on images. Finally, the improved algorithm is evaluated experimentally, and the experimental results show improved reconstruction quality.
22 figures, 3 tables, 56 references.
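The residual modification described in item 3) can be sketched as follows. This is a minimal single-channel NumPy toy, not the paper's actual implementation: the kernel sizes (9-1-5) follow the original SRCNN, the weights are random placeholders, and the global skip connection (output = input + predicted residual) is one common way residual learning is grafted onto SRCNN-style models.

```python
# Hypothetical sketch: three SRCNN-style conv layers predict a residual image
# that is added back to the (already upscaled) low-resolution input.
import numpy as np

def conv2d(img, kernel):
    """'Same'-padded single-channel 2D convolution (naive loops for clarity)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def srcnn_residual(lr_up, k1, k2, k3):
    """SRCNN's three stages, with a global skip connection added."""
    f1 = relu(conv2d(lr_up, k1))  # patch extraction and representation
    f2 = relu(conv2d(f1, k2))     # non-linear mapping
    res = conv2d(f2, k3)          # reconstruction of the residual
    return lr_up + res            # skip: learn only the HR - LR difference

rng = np.random.default_rng(0)
img = rng.random((16, 16))               # stand-in for a bicubic-upscaled patch
k1 = rng.standard_normal((9, 9)) * 0.01  # 9-1-5 kernel sizes as in SRCNN
k2 = rng.standard_normal((1, 1)) * 0.01
k3 = rng.standard_normal((5, 5)) * 0.01
sr = srcnn_residual(img, k1, k2, k3)
print(sr.shape)  # → (16, 16): output resolution matches the input
```

Because the network only has to predict the residual rather than the full high-resolution image, deeper stacks of layers train more easily; this is the usual motivation for combining residual connections with SRCNN-style models.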