Human image segmentation is an essential component of the image processing industry, with broad relevance and many applications. However, it still faces several difficulties, such as inaccurate segmentation along foreground and background boundaries and a shortage of annotations in human image datasets. To address these concerns, this thesis builds a model that improves the accuracy of boundary segmentation and combines it with contrastive learning, achieving high-precision results even when only a limited number of labelled images is available. The main research of this thesis is as follows.

Traditional convolutional neural networks for image segmentation use standard convolutions with fixed receptive fields, which are poorly suited to segmenting the irregularly shaped foreground targets in human images and therefore yield subpar boundary segmentation. This thesis enhances the U-Net model to address this issue: the standard convolution in the downsampling stage is replaced with a deformable convolution that can adapt its sampling locations, and a self-attention module is added to gather more contextual semantic information, reducing the semantic information that U-Net loses through repeated downsampling.

In recent years, contrastive learning applied to self-supervised representation learning has re-emerged, achieving state-of-the-art performance in the unsupervised training of deep image models. This thesis extends the self-supervised contrastive learning method to the fully supervised setting, making effective use of the label information to further improve segmentation accuracy. The segmentation loss and the contrastive loss are summed to obtain the overall loss.

In summary, this thesis proposes a deformable-convolution-based image segmentation model combined with a non-local self-attention mechanism to achieve accurate segmentation of human images, and obtains better segmentation results by combining fully supervised learning with contrastive learning.
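The abstract does not give implementation details for the deformable convolution. As an illustration only, the sketch below shows the core idea that distinguishes it from a standard convolution: each kernel tap samples the input at its regular grid position plus a learned 2-D offset, using bilinear interpolation for fractional locations. The function names and the single-channel, 3x3 setting are hypothetical simplifications.

```python
import numpy as np

def bilinear(img, y, x):
    """Bilinearly sample a (H, W) image at fractional (y, x); zero outside."""
    H, W = img.shape
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    val = 0.0
    for yy, wy in ((y0, 1 - (y - y0)), (y0 + 1, y - y0)):
        for xx, wx in ((x0, 1 - (x - x0)), (x0 + 1, x - x0)):
            if 0 <= yy < H and 0 <= xx < W:
                val += wy * wx * img[yy, xx]
    return val

def deform_conv_point(img, weight, offsets, p0):
    """One output value of a 3x3 deformable convolution centred at p0.

    img:     (H, W) single-channel input
    weight:  (3, 3) kernel
    offsets: (3, 3, 2) learned (dy, dx) shift for each kernel tap;
             in practice these offsets are predicted by a small conv layer.
    """
    out = 0.0
    for i in range(3):
        for j in range(3):
            dy, dx = offsets[i, j]
            y = p0[0] + (i - 1) + dy   # regular grid position plus learned offset
            x = p0[1] + (j - 1) + dx
            out += weight[i, j] * bilinear(img, y, x)
    return out
```

With all offsets zero this reduces exactly to a standard 3x3 convolution; nonzero offsets let the effective receptive field deform to follow irregular human silhouettes.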
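The non-local self-attention module mentioned above lets every spatial position aggregate information from every other position, supplying the global context that repeated downsampling erodes. A minimal NumPy sketch, assuming simple per-position linear projections (the matrices `Wq`, `Wk`, `Wv` are hypothetical stand-ins for learned weights):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def non_local_block(feat, Wq, Wk, Wv):
    """Non-local self-attention over a feature map.

    feat:       (C, H, W) feature map
    Wq, Wk, Wv: (C, C) projection matrices (learned in a real model)
    """
    C, H, W = feat.shape
    x = feat.reshape(C, H * W).T           # (N, C), one row per spatial position
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    attn = softmax(q @ k.T / np.sqrt(C))   # (N, N) affinities between all positions
    out = attn @ v                         # each position mixes context from all others
    return feat + out.T.reshape(C, H, W)   # residual connection keeps original features
```

Because the (N, N) affinity matrix couples all positions, the context available at each pixel is global rather than limited to a convolution's receptive field.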
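The supervised extension of contrastive learning described above can be sketched as follows: samples sharing a label act as mutual positives, everything else as negatives, and the result is added to the segmentation loss. This is an illustrative NumPy version in the style of supervised contrastive loss; the exact formulation and the function name `sup_con_loss` are assumptions, not the thesis's implementation.

```python
import numpy as np

def sup_con_loss(z, labels, tau=0.1):
    """Supervised contrastive loss over a batch of embeddings.

    z:      (N, D) embeddings (L2-normalised inside)
    labels: (N,) class ids; same-label pairs are pulled together
    tau:    temperature scaling the similarities
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / tau                    # cosine similarities / temperature
    N = len(labels)
    loss, terms = 0.0, 0
    for i in range(N):
        pos = [j for j in range(N) if j != i and labels[j] == labels[i]]
        if not pos:                        # anchors without positives contribute nothing
            continue
        denom = sum(np.exp(sim[i, j]) for j in range(N) if j != i)
        for j in pos:
            loss += -np.log(np.exp(sim[i, j]) / denom)
            terms += 1
    return loss / terms

# The overall training objective sums the two losses, as stated in the abstract:
#     total_loss = segmentation_loss + sup_con_loss(z, labels)
```

A batch whose same-class embeddings already coincide yields a near-zero loss, while a batch whose positives are far apart is penalised heavily, which is what drives label-aware feature clustering.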