
Research On Contrastive Learning Method Based On Nearest Neighbor Optimization And Momentum Update

Posted on: 2024-03-15  Degree: Master  Type: Thesis
Country: China  Candidate: Q Y Chen  Full Text: PDF
GTID: 2568307106976199  Subject: Electronic information
Abstract/Summary:
Self-supervised learning mines information from unlabeled data by exploiting pretext tasks, thereby acquiring an encoder that is generally applicable to downstream tasks. Contrastive learning, as a method of self-supervised learning, has achieved great success. Contrastive learning optimizes a similarity measure between positive and negative samples so that positive samples attract each other and negative samples repel each other, thus improving the performance of the encoder on downstream tasks. Contrastive learning adopts a siamese network architecture and maintains two networks for representation learning. When the parameters of the two networks are not shared, they are referred to as the student network and the teacher network. A good teacher network helps the student network learn better, and a good student network in turn adjusts the teacher network better, so mutual learning between the student network and the teacher network is crucial to contrastive learning. Existing algorithms rely on various handcrafted designs and ignore the role of negative samples, which limits the performance of mutual learning between the two networks. This dissertation focuses on how to promote mutual learning between the student network and the teacher network, and carries out the following two works.

(1) To address the problem of how to promote mutual learning between the two networks so as to acquire the best teacher network, this dissertation proposes a contrastive learning algorithm based on bilevel optimization of pseudo-siamese networks (CLBO). The bilevel optimization strategy comprises a student-network optimization strategy based on nearest-neighbor optimization and a teacher-network optimization strategy based on stochastic gradient descent. In the student-network optimization strategy, the teacher network is treated as a constraint term, which helps the student network learn better from the teacher network; the teacher-network optimization strategy then computes the parameters by stochastic gradient descent to update the teacher network. Experiments on five datasets show that CLBO outperforms competing algorithms on downstream tasks and verify that CLBO acquires the best teacher network.

(2) To address the problem of how to promote mutual learning between the two networks so as to acquire the best student network, this dissertation proposes a self-supervised learning algorithm based on alternating contrastive learning and momentum update. Introducing negative-sample information into the teacher network of the positive-sample contrastive learning stage helps that stage acquire a better teacher network, thus improving its performance. Exchanging the contrast strengthens the ability of the negative-sample contrastive learning stage to pull positive samples closer together, thus improving its performance. The two contrastive learning stages promote each other, so that the best student network is acquired in the positive-sample contrastive learning stage. Experiments on five datasets show that this algorithm outperforms competing algorithms on downstream tasks and verify that it acquires the best student network.
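The student/teacher interaction described above can be illustrated with a minimal sketch: an InfoNCE-style contrastive loss that attracts the positive and repels negatives, plus a momentum (exponential moving average) update of the teacher parameters. This is a generic illustration of the two mechanisms the abstract names, not the dissertation's exact formulation; the function names, the temperature value, and the toy embeddings are all assumptions introduced here.

```python
# Hedged sketch (not the thesis's exact method): InfoNCE-style contrastive
# loss plus a momentum (EMA) teacher update, as commonly used in
# student/teacher contrastive learning.
import numpy as np

def momentum_update(teacher_w, student_w, m=0.99):
    """Momentum update: teacher <- m * teacher + (1 - m) * student."""
    return m * teacher_w + (1.0 - m) * student_w

def info_nce_loss(query, positive, negatives, temperature=0.1):
    """InfoNCE loss for one query: attract the positive, repel negatives.

    query:     (d,)   embedding from the student network
    positive:  (d,)   embedding of the positive sample (teacher view)
    negatives: (n, d) embeddings of negative samples
    """
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    logits = np.array([cos(query, positive)] +
                      [cos(query, n) for n in negatives]) / temperature
    logits -= logits.max()  # numerical stability
    # Softmax cross-entropy with the positive at index 0.
    return -logits[0] + np.log(np.exp(logits).sum())

rng = np.random.default_rng(0)
q = rng.normal(size=8)
pos = q + 0.05 * rng.normal(size=8)   # a nearby (augmented) view
negs = rng.normal(size=(16, 8))       # unrelated samples
loss_aligned = info_nce_loss(q, pos, negs)
loss_random = info_nce_loss(q, negs[0], negs[1:])
# A well-aligned positive should yield a lower loss than a random "positive".
assert loss_aligned < loss_random
```

Minimizing such a loss with respect to the student while the teacher follows by momentum update is the general pattern; the dissertation's contribution lies in how the two networks are coupled, which this sketch does not reproduce.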
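The "nearest neighbor optimization" mentioned in work (1) can also be sketched generically: in NNCLR-style methods, the positive for a query is replaced by its nearest neighbor in a support queue of past teacher embeddings, which injects information beyond the query's own augmentations. The dissertation's bilevel formulation may differ; the queue size, similarity measure, and all names below are assumptions for illustration.

```python
# Hedged sketch of nearest-neighbor positive selection: look up the most
# similar entry in a support queue of teacher embeddings and use it as the
# positive. This illustrates the general technique, not the thesis's exact
# student-network optimization strategy.
import numpy as np

def nearest_neighbor(embedding, support_set):
    """Return the support-set row with the highest cosine similarity."""
    support = support_set / np.linalg.norm(support_set, axis=1, keepdims=True)
    query = embedding / np.linalg.norm(embedding)
    sims = support @ query                 # cosine similarities, shape (n,)
    return support_set[int(np.argmax(sims))]

rng = np.random.default_rng(1)
support = rng.normal(size=(32, 8))          # queue of past teacher embeddings
z = support[5] + 0.01 * rng.normal(size=8)  # query close to queue entry 5
nn = nearest_neighbor(z, support)
assert np.allclose(nn, support[5])          # the queue's entry 5 is retrieved
```

The retrieved neighbor would then play the role of the positive in the contrastive loss, which is one common way nearest-neighbor lookup and contrastive learning are combined.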
Keywords/Search Tags:Self-supervised Learning, Contrastive Learning, Siamese Networks, Nearest Neighbor Optimization, Momentum Update