
Generative Adversarial Networks Based On Relaxed Wasserstein Divergence — Algorithm And Implementation

Posted on: 2021-03-19  Degree: Master  Type: Thesis
Country: China  Candidate: Q Zhang  Full Text: PDF
GTID: 2517306503991539  Subject: Applied Statistics
Abstract/Summary:
Generative Adversarial Nets (GANs) are a class of deep neural network architectures and a research hotspot in the field of artificial intelligence. With the development of deep learning, researchers have paid increasing attention to generative models. Generative models simulate the real world: they learn statistical patterns from data and generate similar samples. Because this process involves substantial prior knowledge and a large amount of computation, generative models have developed relatively slowly compared with discriminative models. GANs provide a new approach to generative modelling and have gained extensive attention from both academia and industry.

WGANs improve on the original GANs by introducing a new loss function that effectively measures the discrepancy between the distributions of the generated data and the real data. This loss is an indicator of GAN training progress and helps to avoid the training failures caused by vanishing gradients. Building on WGANs, RWGANs further extend the loss function by relaxing the WGAN requirement that the cost function be symmetric.

Within the RWGAN framework, we propose a new lower-bound function to approximate the original loss function and suggest a new RWGAN algorithm. The new algorithm solves the training problem of the conjugate network in the original RWGAN algorithm and has a form similar to that of WGANs: it can be regarded as a function transformation applied before WGAN training, and different choices of this function affect training efficiency. In this paper, comparative experiments on the MNIST, Fashion-MNIST and CIFAR data sets are conducted to verify our hypothesis. In summary, the new algorithm makes GAN training more stable and faster to converge without reducing image quality.
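To make the WGAN loss discussed above concrete, the following is a minimal, self-contained sketch of the empirical WGAN critic objective. It is illustrative only and not the thesis's implementation: the critic `f`, the sample lists, and the function name `wgan_critic_objective` are all hypothetical stand-ins for the actual networks and batches.

```python
# Sketch of the WGAN critic objective (illustrative, not the thesis's code):
# the critic f is trained to maximise
#     E[f(x_real)] - E[f(x_fake)],
# which, for a 1-Lipschitz f, estimates the Wasserstein-1 distance
# between the real and generated distributions.

def wgan_critic_objective(f, real_samples, fake_samples):
    """Empirical WGAN critic objective for a scalar critic f."""
    mean_real = sum(f(x) for x in real_samples) / len(real_samples)
    mean_fake = sum(f(x) for x in fake_samples) / len(fake_samples)
    return mean_real - mean_fake

# Toy example with a hand-picked 1-Lipschitz critic f(x) = x:
real = [1.0, 2.0, 3.0]   # stand-in for "real" data
fake = [0.0, 0.5, 1.0]   # stand-in for "generated" data
gap = wgan_critic_objective(lambda x: x, real, fake)
print(gap)  # 2.0 - 0.5 = 1.5
```

A larger gap indicates that the critic can still separate the two distributions; as the generator improves, the gap shrinks, which is why this quantity serves as the training-progress indicator mentioned in the abstract. RWGANs generalise the underlying transport cost to an asymmetric Bregman divergence, but the training loop has the same alternating structure.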
Keywords/Search Tags: GANs, Image generation, Wasserstein distance, Bregman divergence