
Learning Trajectory Deep Representation For Trajectory Similarity Computation

Posted on: 2021-02-23 | Degree: Master | Type: Thesis
Country: China | Candidate: Y S Wang | Full Text: PDF
GTID: 2370330614470876 | Subject: Electronic and communication engineering
Abstract/Summary:
Mobile devices equipped with GPS have become widespread, making it possible to collect and analyze large-scale trajectory data. Among trajectory data mining tasks, trajectory similarity computation is fundamental: it underpins mobility pattern research, travel behavior analysis, discovery of popular urban areas, and more. Research on trajectory similarity computation therefore has great practical value.

A trajectory is a continuous curve in the spatial domain, usually represented as a sequence of sample points. In practice, non-uniform or low sampling rates and noise are common in trajectory data, so traditional methods based on pairwise matching of sample points cannot compute trajectory similarity accurately. Deep learning offers a way to address this challenge: it can transform an input trajectory into a low-dimensional vector of fixed length, learning the features of the data in the process, and similarity computed on the learned vectors is more accurate. However, existing methods tend to build encoder-decoder models on recurrent neural networks (RNNs), which have a limited horizon when processing sequences and cannot retain information far from the current time step. A new model that extracts trajectory features more accurately is therefore needed.

We bring models from the natural language processing field to the trajectory representation problem by proposing two trajectory deep representation learning models, Transformer-traj and BERT-traj, both based on the multi-head attention mechanism; they serve as powerful tools for trajectory similarity computation. The contributions of this dissertation are summarized as follows.

(1) We introduce the multi-head attention mechanism into the trajectory deep representation learning model as a substitute for models based on recurrent neural networks. The attention mechanism captures long-distance features in trajectories and integrates features of different scales, which gives the model a more powerful trajectory representation capability; in addition, a model based on the attention mechanism can run in parallel.

(2) We propose the trajectory deep representation models Transformer-traj and BERT-traj. Structurally, the former serves as the foundation of the latter; in terms of performance, the latter achieves higher accuracy and more powerful feature extraction. The models are trained on data preprocessed by downsampling and noise adding, so they are robust enough to produce accurate trajectory representations under non-uniform and low sampling rates as well as noise interference.

(3) We conduct extensive experiments on real-world datasets, which demonstrate BERT-traj's ability to produce accurate trajectory representations. Three evaluation methods are used to assess the models: self-similarity, cross-similarity, and trajectory kNN query; the results show that BERT-traj outperforms the baseline model. In particular, cross-similarity considers both the distance between similar trajectories and the distance between different trajectories, so it can be regarded as the more comprehensive evaluation measure. Under this evaluation on a small dataset, the accuracy of BERT-traj exceeds that of the baseline model by 23.26% and 14.06% when distortion and downsampling are applied, respectively; on a larger dataset, it exceeds the baseline by 23.71% and 22.42%, respectively. The dissertation contains 26 figures, 12 tables, and 48 references.
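To make the architecture in contribution (1) concrete, the sketch below shows a minimal multi-head-attention trajectory encoder in PyTorch. It is not the thesis implementation: the grid-cell tokenization, model sizes, and mean-pooling readout are illustrative assumptions.

```python
# Minimal sketch of a multi-head-attention trajectory encoder (PyTorch).
# Assumptions (not from the thesis): trajectories are discretized into
# grid-cell token ids, and the mean-pooled encoder output serves as the
# fixed-length trajectory vector.
import torch
import torch.nn as nn

class TrajectoryEncoder(nn.Module):
    def __init__(self, vocab_size=10000, d_model=128, n_heads=8,
                 n_layers=4, max_len=512):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model, padding_idx=0)
        self.pos = nn.Embedding(max_len, d_model)  # learned positions
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)

    def forward(self, tokens, pad_mask):
        # tokens: (batch, seq_len) cell ids; pad_mask: True where padded.
        pos = torch.arange(tokens.size(1), device=tokens.device)
        h = self.encoder(self.tok(tokens) + self.pos(pos),
                         src_key_padding_mask=pad_mask)
        # Mean-pool the non-padded positions into one fixed-length vector.
        keep = (~pad_mask).unsqueeze(-1).float()
        return (h * keep).sum(1) / keep.sum(1).clamp(min=1.0)

enc = TrajectoryEncoder()
tokens = torch.randint(1, 10000, (2, 50))            # two toy trajectories
vecs = enc(tokens, torch.zeros(2, 50, dtype=torch.bool))  # (2, 128)
sim = torch.cosine_similarity(vecs[0], vecs[1], dim=0)
```

Unlike an RNN, every layer here attends over all time steps at once, which is what lets the model relate distant sample points and run in parallel.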
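The robustness training in contribution (2) relies on downsampling and noise-adding preprocessing. A minimal sketch of both corruptions follows; the drop rate and noise scale are hypothetical values, not the thesis's settings.

```python
# Minimal sketch of the two corruptions used for robustness training.
# drop_rate, rate, and noise_std are illustrative, not the thesis's values.
import random

def downsample(points, drop_rate=0.4):
    """Randomly drop sample points to mimic a low, non-uniform sampling rate."""
    kept = [p for p in points if random.random() > drop_rate]
    return kept if len(kept) >= 2 else [points[0], points[-1]]

def distort(points, rate=0.4, noise_std=1e-4):
    """Perturb a fraction of points with Gaussian noise to mimic GPS error."""
    return [(lat + random.gauss(0.0, noise_std),
             lon + random.gauss(0.0, noise_std))
            if random.random() < rate else (lat, lon)
            for lat, lon in points]
```

In a typical setup of this kind, the model is fed the corrupted trajectory and trained so that its representation stays close to that of the original, which is what yields robustness to low sampling rates and noise.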
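Of the three evaluations in contribution (3), the trajectory kNN query is the most direct to illustrate: embed every trajectory once, then rank the database vectors by similarity to the query vector. The brute-force sketch below assumes cosine similarity; the distance actually used in the thesis is not stated in this abstract.

```python
# Minimal sketch of a kNN query over learned trajectory embeddings.
# Brute-force cosine similarity is an assumption, not the thesis's choice.
import torch

def knn_query(query_vec, db_vecs, k=10):
    """Return indices of the k database trajectories most similar to the query."""
    q = query_vec / query_vec.norm()
    db = db_vecs / db_vecs.norm(dim=1, keepdim=True)
    return torch.topk(db @ q, k).indices   # cosine similarity ranking

db = torch.randn(1000, 128)      # embeddings of 1000 stored trajectories
neighbours = knn_query(torch.randn(128), db, k=10)
```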
Keywords/Search Tags: Deep learning, Trajectory similarity computation, Attention mechanism