Graph Neural Networks have achieved strong results in tasks such as node classification and link prediction, but there is still considerable room for improvement on graph classification. Graph classification algorithms fall into two categories: graph-pooling-based algorithms and node-voting-based algorithms. Current graph-pooling-based algorithms lose a large amount of feature and structure information during pooling and do not consider task relevance when generating graph features, so the generated graph features lack expressiveness and the accuracy of graph classification suffers. Current node-voting algorithms do not let nodes perceive sufficiently rich global graph information, so each node lacks an objective basis for judging the graph category, which affects the accuracy of the final classification. To address these problems, this paper improves the graph-pooling-based and the node-voting-based graph classification algorithms, respectively, as follows.

A graph-pooling-based graph classification algorithm called Local Capsule Pooling Network (LCPN) is proposed. LCPN uses graph convolution combined with Local Capsule Pooling (LCP) to learn node features with hierarchical information, then applies a task-aware readout mechanism, Task Aware Readout (TAR), to generate high-quality graph features, which are fed into a multilayer perceptron to classify the graph; the model is trained with both a classification loss and a Pooling Information Loss (PIL). TAR assigns larger weights to task-related nodes and generates the graph feature as a weighted sum of node features. PIL further reduces the information lost during graph pooling by penalizing the perturbation of the node feature distribution before and after pooling. As a result, LCPN loses less information during pooling and generates more expressive graph features, and it shows excellent performance in graph classification experiments, adjacency matrix visualization experiments, and node feature reconstruction experiments.

A Transformer-based node-voting graph classification algorithm, TNVC (Transformer-based Node-Voting Classifying), is proposed. TNVC uses Structural Attention (SA) as a bias on the self-attention mechanism in the Transformer and uses this SA-augmented Transformer to let nodes learn global graph information. Each node then predicts the category of the graph separately, and TNVC combines the judgments of all nodes to compute the final classification result. SA considers the multi-order adjacency relationships among nodes and encodes the probabilities of nodes visiting each other at different orders into structural attention values that modify the Transformer's self-attention, so that the Transformer can sense graph structure and handle graph data well. Because the Transformer perceives global information, replacing the graph neural network with the SA-augmented Transformer for node feature learning solves the problem that nodes in previous node-voting algorithms could not perceive sufficiently rich global graph information, allowing each node to judge the graph category more accurately. Test results on several graph classification datasets show that TNVC outperforms other node-voting graph classification models.
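The abstract does not give TAR's exact formulation. The following is a minimal PyTorch sketch of a task-aware, attention-weighted readout in the spirit described above; the learnable task query, the linear key projection, and all names are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class TaskAwareReadout(nn.Module):
    """Hypothetical sketch of a Task Aware Readout (TAR) layer.

    Assumes TAR scores each node against a learnable, task-specific query
    and returns the attention-weighted sum of node features as the graph
    feature; the real LCPN formulation may differ.
    """
    def __init__(self, dim):
        super().__init__()
        self.query = nn.Parameter(torch.randn(dim))    # task-specific query (assumption)
        self.key = nn.Linear(dim, dim)                 # projects node features to keys

    def forward(self, h):                              # h: (num_nodes, dim) node features
        scores = self.key(h) @ self.query              # task relevance of each node
        weights = torch.softmax(scores, dim=0)         # normalise weights over nodes
        return (weights.unsqueeze(-1) * h).sum(dim=0)  # weighted sum -> graph feature
```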
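PIL is described only as penalizing the perturbation of the node feature distribution before and after pooling. One way such a loss could look, purely as a hedged sketch, is a KL divergence between normalized summaries of the features before and after pooling; the summary statistic and the divergence used here are assumptions.

```python
import torch.nn.functional as F

def pooling_information_loss(h_before, h_after):
    """Hypothetical sketch of the Pooling Information Loss (PIL).

    Compares a summary of the node-feature distribution before pooling
    (h_before: (N, d)) with the one after pooling (h_after: (M, d), M < N)
    and penalises the shift with a KL divergence; the actual definition
    in LCPN may differ.
    """
    p = F.log_softmax(h_before.mean(dim=0), dim=-1)  # summary before pooling (log-probs)
    q = F.softmax(h_after.mean(dim=0), dim=-1)       # summary after pooling (probs)
    return F.kl_div(p, q, reduction="sum")           # penalise the distribution shift
```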
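For SA, the abstract says multi-order visit probabilities between nodes are encoded as an additive bias on the Transformer's self-attention. A rough sketch under that reading is shown below; the number of orders, the uniform mixing of random-walk powers, and the single-head attention layer are assumptions rather than TNVC's exact design.

```python
import torch
import torch.nn as nn

def structural_attention_bias(adj, num_orders=3):
    """Hypothetical Structural Attention (SA) bias.

    Mixes the k-step random-walk visit probabilities (k = 1..num_orders)
    derived from the dense adjacency matrix adj (N, N) into an additive
    bias on the self-attention logits; TNVC's exact encoding may differ.
    """
    deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
    walk = adj / deg                                          # 1-step visit probabilities
    power = torch.eye(adj.size(0), dtype=adj.dtype, device=adj.device)
    bias = torch.zeros_like(adj)
    for _ in range(num_orders):
        power = power @ walk                                  # k-step visit probabilities
        bias = bias + power / num_orders                      # uniform mixing (assumption)
    return bias

class SABiasedSelfAttention(nn.Module):
    """Single-head self-attention with the SA bias added to its logits."""
    def __init__(self, dim):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)
        self.scale = dim ** -0.5

    def forward(self, h, adj):                                # h: (N, dim) node features
        q, k, v = self.qkv(h).chunk(3, dim=-1)
        logits = (q @ k.t()) * self.scale + structural_attention_bias(adj)
        return torch.softmax(logits, dim=-1) @ v              # structure-aware attention
```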
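Finally, the node-voting step, in which every node judges the graph category and the judgments are combined, could be sketched as follows; averaging the per-node class probabilities is an assumption, and TNVC's actual aggregation rule may differ.

```python
import torch
import torch.nn as nn

class NodeVotingHead(nn.Module):
    """Hypothetical node-voting head for TNVC.

    Each node predicts the graph category from its own feature, and the
    per-node predictions are averaged into the final graph-level result.
    """
    def __init__(self, dim, num_classes):
        super().__init__()
        self.vote = nn.Linear(dim, num_classes)

    def forward(self, h):                                 # h: (N, dim) features from the SA Transformer
        node_probs = torch.softmax(self.vote(h), dim=-1)  # each node's judgment of the class
        return node_probs.mean(dim=0)                     # combine all votes into the final result
```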