
A Multi-source Domain Transfer Learning Method With Dictionary Learning And Its Parallel Implementation

Posted on: 2021-01-29
Degree: Master
Type: Thesis
Country: China
Candidate: Y H Liang
Full Text: PDF
GTID: 2428330611467567
Subject: Computer technology
Abstract/Summary:
Transfer learning (TL) is a class of machine learning methods that exploits data from more than one domain, transferring knowledge learned on source tasks into the learning process of a target task. TL is widely used in computer vision and natural language processing; as a general machine learning technique, it has also been applied in Cyber-Physical Systems (CPS). In this thesis, building on the multi-task support vector machine (multi-task SVM) and dictionary learning, we propose a multi-source domain transfer learning method fusing dictionary learning (DMSTL). The method addresses the multi-source transfer learning setting and introduces dictionary learning into transfer learning: dictionary learning is used to learn a discriminative sparse representation, and a multi-task SVM transfers knowledge from the multiple source domains so as to improve performance on the target task. In addition, to enhance real-time performance, we propose a parallel optimization scheme based on alternate convex search, yielding a parallel multi-source domain transfer learning method fusing dictionary learning (PDMSTL); this parallel scheme optimizes the parameters of PDMSTL so that an optimal model can be obtained.

The contributions of this thesis are threefold.

First, we consider the multi-source transfer learning setting and propose DMSTL. In DMSTL, dictionary learning is used to determine the relationships among samples drawn from the multiple sources; by exploiting dictionary learning, we obtain a more discriminative sparse representation of each sample in each domain, which improves task performance in all domains.

Second, we introduce the multi-task SVM into the proposed method. Knowledge is transferred among domains through shared parameters, enhancing performance on the target-domain task. We also introduce domain-specific parameters for the tasks in each domain, which work jointly with the shared parameters to capture the relationships among samples within the same domain.

Third, considering the characteristics of CPS, we propose the parallel variant PDMSTL. Based on the alternate convex search method, we develop a parallel optimization method together with a convergence analysis; the analysis shows that the proposed parallel optimization is guaranteed to converge, from which we conclude that PDMSTL is well founded.

Finally, we conduct extensive experiments to evaluate the proposed DMSTL and PDMSTL methods. On well-known image, text, and CPS data sets, we implement the proposed methods and compare them with existing TL methods. We also use the Friedman test and the Bonferroni-Dunn test to analyze the efficiency of the proposed methods from a statistical point of view. The results show that DMSTL and PDMSTL outperform the existing methods.
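To make the "discriminative sparse representation" step concrete, the sketch below solves the standard l1-regularized sparse-coding problem min_z 0.5*||x - Dz||^2 + lam*||z||_1 by iterative soft-thresholding (ISTA), assuming a fixed, column-normalized dictionary D. This is a generic illustration, not the thesis's actual algorithm; the function name and parameters are made up for the example.

```python
import numpy as np

def ista_sparse_code(D, x, lam=0.1, n_iter=200):
    """Sparse-code x against dictionary D by ISTA:
    minimize 0.5*||x - D z||^2 + lam*||z||_1."""
    L = np.linalg.norm(D, 2) ** 2      # Lipschitz constant of the smooth part
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ z - x)       # gradient step on the quadratic term
        z = z - grad / L
        # soft-thresholding = proximal step for the l1 penalty
        z = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return z
```

In a DMSTL-style pipeline the resulting codes `z`, rather than the raw features, would be fed to the downstream classifier.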
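The alternate-convex-search idea behind the parallel optimizer — fix the shared parameters, solve each domain's specific parameters independently (and hence in parallel), then update the shared part with the specific parts fixed — can be sketched on a toy shared/specific least-squares model. The objective, variable names, and closed-form updates below are illustrative assumptions, not the thesis's actual formulation; since each block subproblem is convex and solved exactly, the objective cannot increase across iterations.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def fit_alternating(A_list, y_list, mu=1.0, n_outer=20):
    """Alternate convex search on a toy shared/specific model:
    min_{w, v_d}  sum_d ||A_d (w + v_d) - y_d||^2 + mu * ||v_d||^2.
    The per-domain v_d updates are independent, so they run in parallel."""
    p = A_list[0].shape[1]
    w = np.zeros(p)
    v = [np.zeros(p) for _ in A_list]

    def solve_v(args):
        # closed-form ridge update for one domain, shared w held fixed
        A, y, w_fixed = args
        return np.linalg.solve(A.T @ A + mu * np.eye(p),
                               A.T @ (y - A @ w_fixed))

    for _ in range(n_outer):
        with ThreadPoolExecutor() as pool:          # parallel specific step
            v = list(pool.map(solve_v,
                              [(A, y, w) for A, y in zip(A_list, y_list)]))
        # shared step: least squares with the specific parts fixed
        G = sum(A.T @ A for A in A_list)
        b = sum(A.T @ (y - A @ vd)
                for A, y, vd in zip(A_list, y_list, v))
        w = np.linalg.solve(G, b)
    return w, v

def objective(A_list, y_list, w, v, mu=1.0):
    return sum(np.sum((A @ (w + vd) - y) ** 2) + mu * np.sum(vd ** 2)
               for A, y, vd in zip(A_list, y_list, v))
```

The same fix-one-block/solve-the-others pattern carries over when the blocks are the dictionary, the sparse codes, and the SVM parameters.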
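The statistical comparison described above can be carried out with SciPy's implementation of the Friedman test, which compares the ranks of several methods across multiple data sets. The accuracy numbers below are fabricated for illustration only; when the test rejects the null hypothesis of equal ranks, a Bonferroni-Dunn post-hoc comparison against the control method would normally follow.

```python
from scipy.stats import friedmanchisquare

# hypothetical accuracies of three methods on five data sets
acc_method_1 = [0.91, 0.88, 0.93, 0.85, 0.90]
acc_method_2 = [0.87, 0.84, 0.90, 0.82, 0.86]
acc_method_3 = [0.85, 0.86, 0.88, 0.80, 0.84]

# null hypothesis: all methods have equal average ranks
stat, p = friedmanchisquare(acc_method_1, acc_method_2, acc_method_3)
if p < 0.05:
    print("ranks differ significantly; run a Bonferroni-Dunn post-hoc test")
```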
Keywords/Search Tags:Transfer learning, Dictionary learning, Parallel computing