Dynamic networks are time-varying data structures that describe the complex interactions and evolutionary patterns among entities in real-world systems, such as social networks, traffic networks, and citation networks. How to represent network information effectively has therefore become an important research problem. Network Representation Learning (NRL), also known as graph embedding, aims to transform high-dimensional, sparse network structure data into low-dimensional, dense vectors, so that network data can be represented more effectively and applied to various downstream tasks. However, traditional network representation methods are only applicable to static networks. In real life, most networks change dynamically: nodes and edges are added or removed, and node and edge attributes change over time. This makes dynamic network representation learning an important research direction, and calls for new methods that can handle dynamic network data and capture both its temporal and structural features.

In this paper, we propose three methods for representation learning on dynamic networks. First, we propose LARW (Network Representation Learning Algorithm Based on Long Anonymous Random Walks), a dynamic network representation learning method built on the self-attention mechanism, anonymous random walks, and long-sequence time-series prediction. By encoding and aggregating all nodes in the set of walks, LARW obtains the representation of the node from which the walks were issued. Combined with the long-sequence time-series prediction method, it can improve the predictability of node embeddings by increasing the length of the walk sequences.

Second, this paper proposes EGCTN (Evolve Graph Convolutional Transformer Network), a dynamic network representation method based on evolutionary graph convolution. This method uses an Evolve Graph Convolutional Neural Network to capture the relationships between nodes and the overall structure of the network, while using a Transformer to iteratively update the GCN weight matrices, so as to better capture dynamic changes in the network. Compared with traditional RNN-like approaches, the self-attention mechanism in the Transformer enables parallel computation, which greatly improves computational efficiency and reduces training time; it also offers better representation capability and interpretability.

Finally, this paper proposes TGNUPS (Temporal Graph Networks Use ProbSparse), a dynamic network representation method based on the ProbSparse attention mechanism. It is a general graph neural network framework that uses a node memory module to store the historical interaction information of nodes as messages, processes these messages with message functions and message aggregators, and finally updates node states with a node memory update module. It also uses a ProbSparse self-attention model to iterate over the parameters, avoiding both the need to load large amounts of data and the cost of back-propagation for every node embedding update.
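To illustrate the anonymous random walks that LARW builds on, the following is a minimal Python sketch (not the paper's implementation): each node in a sampled walk is replaced by the index of its first appearance, so walks record structural patterns rather than node identities. The function names and adjacency format are assumptions for illustration only.

```python
import random

def anonymize(walk):
    """Map a walk over node identifiers to its anonymous form:
    each node is replaced by the index of its first appearance."""
    first_seen = {}
    anon = []
    for node in walk:
        if node not in first_seen:
            first_seen[node] = len(first_seen)
        anon.append(first_seen[node])
    return anon

def anonymous_random_walk(adj, start, length, rng=random):
    """Sample one random walk of the given length from `start` and anonymize it.
    `adj` maps each node to a list of its neighbours."""
    walk = [start]
    for _ in range(length - 1):
        neighbours = adj[walk[-1]]
        if not neighbours:
            break
        walk.append(rng.choice(neighbours))
    return anonymize(walk)
```

For example, the walks `a -> b -> a -> c` and `x -> y -> x -> z` both anonymize to `[0, 1, 0, 2]`, which is what lets anonymized walk statistics generalize across nodes.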
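The weight-evolution idea behind EGCTN can be sketched roughly as follows; this is a hypothetical simplification, not the paper's model. It replaces the RNN used in EvolveGCN-style methods with a single self-attention step over the stack of past GCN weight matrices, reading off the last position as the next weight matrix (the projection matrices `Wq`, `Wk`, `Wv` are assumed here for illustration).

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def evolve_weights(W_hist, Wq, Wk, Wv):
    """Hypothetical sketch: evolve a GCN weight matrix with self-attention
    over its own history instead of an RNN.
    W_hist: (T, r, c) stack of past GCN weights, one per snapshot;
    Wq, Wk, Wv: (r*c, r*c) query/key/value projections (assumed)."""
    T, r, c = W_hist.shape
    X = W_hist.reshape(T, r * c)            # one "token" per time step
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(r * c))   # (T, T) attention over history
    return (A @ V)[-1].reshape(r, c)        # last position = next-step weights
```

Because every time step attends to the full history in one matrix product, the update is parallel over snapshots, which is the efficiency argument made for the Transformer over RNN-like updates.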
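The ProbSparse attention used in TGNUPS (originating in the Informer model) keeps full attention only for the most "active" queries, measured by how far a query's maximum score deviates from its mean score. A simplified NumPy sketch follows; for clarity it computes the sparsity measure from the full score matrix, whereas the practical algorithm estimates it from a sample of keys.

```python
import numpy as np

def probsparse_attention(Q, K, V, u):
    """Simplified ProbSparse self-attention: the top-u queries by sparsity
    measure attend over all keys; the remaining "lazy" queries fall back
    to the mean of V. Q, K, V: (n, d) arrays; u: number of queries kept."""
    n, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                      # (n, n)
    # Sparsity measure per query: max score minus mean score.
    sparsity = scores.max(axis=1) - scores.mean(axis=1)
    top = np.argsort(-sparsity)[:u]                    # active query indices
    out = np.tile(V.mean(axis=0), (n, 1))              # lazy rows -> mean(V)
    w = np.exp(scores[top] - scores[top].max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                  # softmax over keys
    out[top] = w @ V                                   # full attention for top-u
    return out
```

Restricting the softmax and value aggregation to u rows is what reduces the per-update cost relative to dense self-attention, which matches the motivation given above for avoiding heavy computation on every node embedding update.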