With the development of information technology services and the rapid growth of information, recommender systems that help users filter information have received more and more attention. Recommender systems rely on large amounts of user data, most commonly historical behavior data. Most previous recommender systems exploit only a single type of user-item interaction and therefore suffer from a serious data sparsity problem. Multi-behavior recommendation, which has attracted growing attention, leverages multiple types of user-item interactions. However, early multi-behavior recommendation methods cannot adequately model the relations between different behaviors. Moreover, introducing multi-behavior information increases both the sampling bias of negative sampling and the complexity of non-sampling methods.

To address the modeling of relations between behaviors, this thesis proposes a method based on tensor decomposition and non-sampling learning, named non-sampling heterogeneous block term decomposition (NHBTD). NHBTD obtains more expressive user, item, and behavior embeddings through block term decomposition, designs heterogeneous block terms to learn the relations between behaviors, controls the output of each term through weight vectors, and uses multi-task learning to adjust the weights of different behavior data. In addition, NHBTD learns its parameters from the whole data without negative sampling. Experiments on five datasets demonstrate the necessity of introducing multi-behavior data and show that NHBTD captures inter-behavior relations more effectively, achieves excellent recommendation results, and helps alleviate the data sparsity problem.

To address the high complexity of block term decomposition and the non-sampling strategy, this thesis further proposes a method built on NHBTD, named efficient non-sampling heterogeneous block term decomposition (ENHBTD), which shares user and item embeddings across different terms and extends a loss-function optimization technique to tensor decomposition. In addition, the thesis explores replacing the weight vectors with an attention mechanism. Experiments on three datasets show that ENHBTD reduces the number of parameters and the training time without degrading recommendation performance, and that introducing the attention mechanism further improves the recommendation results.
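For orientation, the sketch below gives the generic block term decomposition of a three-way user-item-behavior interaction tensor and a standard whole-data (non-sampling) weighted squared loss. The abstract does not specify NHBTD's heterogeneous terms, weight vectors, or multi-task weighting, so the notation here ($\mathcal{G}_r$, $\mathbf{A}_r$, $\mathbf{B}_r$, $\mathbf{C}_r$, $w_{uik}$) is only an illustrative baseline on which such a method could build, not the thesis's exact formulation.

```latex
% Generic block term decomposition (BTD): the interaction tensor
% \mathcal{Y} \in \mathbb{R}^{|U| \times |I| \times |K|} (users x items x behaviors)
% is approximated by R block terms, each with a core tensor \mathcal{G}_r and
% factor matrices \mathbf{A}_r (users), \mathbf{B}_r (items), \mathbf{C}_r (behaviors).
\hat{\mathcal{Y}} = \sum_{r=1}^{R} \mathcal{G}_r \times_1 \mathbf{A}_r \times_2 \mathbf{B}_r \times_3 \mathbf{C}_r

% Whole-data (non-sampling) learning: a weighted squared loss over all
% (user, item, behavior) entries, with confidence weights w_{uik} replacing
% sampled negative instances.
\mathcal{L} = \sum_{u \in U} \sum_{i \in I} \sum_{k \in K}
    w_{uik} \, \bigl( y_{uik} - \hat{y}_{uik} \bigr)^2
```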