
Applications Of Neural Networks To Multivariate Time Series Problems

Posted on: 2021-03-27    Degree: Master    Type: Thesis
Country: China    Candidate: R Zhang    Full Text: PDF
GTID: 2512306302476134    Subject: Financial Information Engineering
Abstract/Summary:
Time series data account for a large share of the data generated across all walks of life. By forecasting a time series, people can make reasonable projections about the future to support decision-making or early warning, so time series analysis has long been a very active research topic. With the growth of big data, people increasingly face multivariate time series, which are composed of many related sequences from the same domain. Besides the temporal dependence found in traditional univariate series, the component sequences of a multivariate series may also be closely related to one another. Modeling these complex, nonlinear dependencies across time and across series simultaneously has therefore become a focus of research.

In this thesis, we propose a novel neural network architecture, the Temporal Convolutional Network with Attention (TCNA), designed to handle multivariate time series forecasting efficiently, especially complex problems with long horizons and many sequences. First, we use a Temporal Convolutional Network (TCN) to model the temporal dependence of the data. We abandon the Recurrent Neural Network (RNN) traditionally used for time series problems, because its sequential iteration forces the computation at each time step to wait for the previous step to finish, so the process cannot be parallelized; moreover, when the sequence is very long, attenuation of long-range information is unavoidable. In contrast, the TCN is based on the Convolutional Neural Network (CNN), so there is no time-dependent information attenuation; its convolution kernels are independent of one another, so the computation can run in parallel; and the receptive field can be controlled flexibly by changing the kernel width or the number of network layers. Next, we use a 1 × 1 convolution to integrate and refine the information of each sequence along the time
dimension to enhance the model's learning ability.

Building on this, we apply an attention mechanism across the sequences to model their correlations. We compare the output of the TCN with each integrated and refined sequence, compute the degree of correlation, take the weighted average of all sequences to obtain a context vector, and combine it with the initial result to produce the prediction. Finally, we integrate a traditional autoregressive linear module into the model in parallel. On the one hand, the data contain certain linear features, and in simple data their proportion may be large; on the other hand, the data contain noise, and a highly complex nonlinear module may learn features that the data do not actually have. Fusing the predictions of the nonlinear module with those of the linear module therefore effectively improves the robustness of the model.

We evaluate the proposed TCNA on four datasets (exchange rate, electricity, solar energy, and traffic), compare the results with traditional time series methods and multivariate time series methods (LSTNet, TPA), and explore the necessity and importance of the linear module and the attention mechanism through an ablation study. We find that the attention mechanism plays a crucial role in forecasting complex data, while the linear module enables our model to perform well on simple data. The experimental results show that the proposed TCNA successfully captures both the temporal dependence and the inter-sequence correlation of the data and can handle multivariate time series forecasting efficiently. TCNA achieves a significant performance improvement over the benchmark methods, especially on complex problems with long horizons and many sequences.
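The three ingredients described above (dilated causal convolution, attention over the sequences, and a parallel autoregressive module) can be sketched end to end in a toy NumPy illustration. All function names, shapes, coefficients, and the additive fusion rule below are my own illustrative assumptions, not the thesis implementation:

```python
import numpy as np

def dilated_causal_conv1d(x, kernel, dilation=1):
    """Causal 1-D convolution: the output at time t sees only inputs at
    t, t-d, t-2d, ... (d = dilation), so no future information leaks in
    and every output position can be computed independently (in parallel)."""
    k = len(kernel)
    pad = (k - 1) * dilation                      # left-pad to keep length
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(kernel[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def attention_context(query, series_feats):
    """Score each series' feature vector against the query, then form the
    context vector as the score-weighted average of all series."""
    scores = series_feats @ query                 # (n_series,)
    weights = softmax(scores)
    return weights @ series_feats, weights        # context: (d,)

def ar_forecast(history, coeffs):
    """Plain autoregressive prediction from the last p observations."""
    p = len(coeffs)
    return float(np.dot(coeffs, history[-p:]))

# --- toy usage -------------------------------------------------------
rng = np.random.default_rng(0)
x = rng.standard_normal(16)

# Stacked dilations 1, 2, 4 with kernel size 2 give a receptive field
# of 1 + 1 + 2 + 4 = 8 time steps.
h = x
for d in (1, 2, 4):
    h = dilated_causal_conv1d(h, kernel=[0.5, 0.5], dilation=d)

# Stand-ins for the refined 1 x 1 convolution outputs of 5 series
# (feature dimension 4) and a query derived from the TCN output.
feats = rng.standard_normal((5, 4))
query = rng.standard_normal(4)
context, weights = attention_context(query, feats)

# Additive fusion of the nonlinear and linear predictions; the additive
# rule is an assumption here, in the spirit of LSTNet-style models.
y_nonlinear = h[-1]
y_linear = ar_forecast(x, coeffs=np.array([0.2, 0.3, 0.5]))
y_hat = y_nonlinear + y_linear
```

With the dilation schedule 1, 2, 4 an impulse at time 0 influences outputs only up to time 7, which matches the stated receptive field of 8 steps; the attention weights always sum to 1, so the context vector is a convex combination of the series features.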
Keywords/Search Tags:Multivariate Time Series, Temporal Convolution Network, Attention Mechanism, Auto-Regressive Model