Whole slide image (WSI) classification and lesion localization in breast pathology are challenging tasks in computational pathology because they require adequately context-aware histopathological feature representations. Given the characteristics of WSIs, namely their multi-scale nature, very large size, and the difficulty of annotation, most existing algorithms are built on multiple instance learning (MIL), since MIL requires only slide-level labels to classify pathological WSIs. However, MIL treats the image patches at different positions in a WSI as independent and identically distributed regions, so existing algorithms cannot effectively exploit the spatial position and context-aware information of image patches for WSI classification and lesion localization. To address these problems, this paper proposes a whole slide image recognition and classification model based on graph convolutional networks and attention-based multiple instance learning (ABMIL-GCN). The method first uses ResNet50 to extract features from WSI image patches and construct WSI-level graph data. Second, a graph convolutional network models the local and global topological structure of the image patches, preserving their spatial position and context information on the original WSI and establishing dependencies between patch features. Finally, a multiple instance pooling network based on a gated attention mechanism quantifies the attention score of each patch feature to achieve WSI-level feature aggregation and heat map visualization. In addition, this paper applies the flooding optimization method to further improve the performance of the ABMIL-GCN model. Experiments show that the average accuracy and AUC of the optimized ABMIL-GCN model on the Camelyon16 test set increase to 90.89% and 0.9149, respectively, outperforming existing weakly supervised learning algorithms.

Although the ABMIL-GCN model can establish dependencies between image patches, the context information it captures is limited by the adjacency matrix of the graph convolutional network: it can only model dependencies between a patch and its neighboring patches. Therefore, to capture spatial position and context information over a larger range, this paper proposes a whole slide image recognition and classification model based on bidirectional gated recurrent units and attention-based multiple instance learning (ABMIL-BiGRU). This method first scans the WSI by rows and columns to extract image patches and uses ResNet50 to extract patch features; it then uses bidirectional gated recurrent units to establish long- and short-range dependencies between the row-wise and column-wise patch features and concatenates them, embedding spatial position and context-aware information over a wider range. Finally, a multiple instance pooling network based on a gated attention mechanism quantifies the attention score of each concatenated feature to achieve WSI-level feature aggregation and heat map visualization. Experiments show that the average accuracy and AUC of the ABMIL-BiGRU model on the Camelyon16 test set reach 91.86% and 0.9467, respectively, verifying its superiority over other methods.
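Both proposed models share the gated-attention MIL pooling step described above. As a minimal illustrative sketch (not the thesis implementation), the standard gated attention pooling of Ilse et al. can be written in PyTorch as below; the feature dimension (1024) and hidden size (256) are assumed values for illustration only, and `attn` is the per-patch weight that would drive the heat map visualization.

```python
import torch
import torch.nn as nn

class GatedAttentionPooling(nn.Module):
    """Gated attention-based MIL pooling (Ilse et al., 2018).

    Aggregates N patch embeddings of shape (N, feat_dim) into one
    slide-level embedding of shape (feat_dim,). Dimensions here are
    illustrative assumptions, not values taken from the thesis.
    """
    def __init__(self, feat_dim: int = 1024, hidden_dim: int = 256):
        super().__init__()
        self.V = nn.Linear(feat_dim, hidden_dim)   # tanh branch
        self.U = nn.Linear(feat_dim, hidden_dim)   # sigmoid gate branch
        self.w = nn.Linear(hidden_dim, 1)          # attention score head

    def forward(self, h: torch.Tensor):            # h: (N, feat_dim)
        gate = torch.tanh(self.V(h)) * torch.sigmoid(self.U(h))  # (N, hidden_dim)
        scores = self.w(gate)                       # (N, 1) unnormalized scores
        attn = torch.softmax(scores, dim=0)         # (N, 1) per-patch weights
        slide_feat = (attn * h).sum(dim=0)          # (feat_dim,) bag embedding
        return slide_feat, attn                     # attn also yields the heat map


# Hypothetical usage: patch features from a ResNet50 backbone.
patch_feats = torch.randn(500, 1024)               # 500 patches, 1024-dim each
pool = GatedAttentionPooling()
slide_feat, attn = pool(patch_feats)
```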
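The flooding optimization mentioned for ABMIL-GCN refers to the regularizer of Ishida et al. (2020), which keeps the training loss from dropping below a "flood level" b. A one-line sketch is given below; the flood level 0.05 is an assumed example value, not the one tuned in this work.

```python
import torch

def flooding_loss(loss: torch.Tensor, b: float = 0.05) -> torch.Tensor:
    """Flooding regularization: gradient ascends when loss < b,
    descends otherwise, discouraging overfitting to the training set."""
    return (loss - b).abs() + b

# Hypothetical usage with a standard classification loss:
# loss = flooding_loss(torch.nn.functional.cross_entropy(logits, labels))
# loss.backward()
```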