Our country has fully entered the 5G era. Compared with the past, conditions are now in place for rapidly transmitting massive amounts of data. Sentiment classification is widely used in stock market analysis, consumer psychology analysis, enterprise market demand analysis, and other areas, and is closely tied to people's daily lives.

For coarse-grained sentence-level sentiment classification, this paper proposes a joint training model that fuses multiple types of word embeddings. In terms of word embeddings, unlike the earlier BertGCN, which simply used BERT and its derivative models to represent text, this paper supplements each dataset with GloVe word embeddings, expanding the range of embedding types available to the model. In terms of model improvements, GraphSAGE replaces GCN, which enhances the model's ability to generalize when classifying unseen nodes, and a DPCNN network is added to mine deep features from the GloVe embeddings. The two types of word embeddings are combined as additional features for the text nodes to be classified. To enable co-training of the three networks, different learning rates are set to suit the different network types. To ensure that the three sub-networks are trained simultaneously, each graph input to GraphSAGE has the same size as the input to DPCNN, and RoBERTa receives text nodes with the same batch size. The loss of the overall model during co-training can thus be propagated accurately to the three sub-networks, GraphSAGE, DPCNN, and RoBERTa, allowing each sub-network to converge stably during training.

For fine-grained target-level sentiment analysis, this paper proposes a semantic cosine distance feature fusion model. Compared with the earlier LCF-BERT, the model's input sequence is transformed into word embeddings of the target word and the local context. This processing makes it easier for the attention mechanism to realize the feature
interaction between the target word and the local context. In terms of model improvements, this paper uses the semantic cosine distance between each word in the local context and the target word for dynamic weighting and dynamic masking, whereas LCF-BERT can apply only one of these two mechanisms at a time. This addresses the issue that LCF-BERT may attenuate or obscure features both far from and close to the target word. This paper further proposes an attention computation method based on the summation of column embeddings, which performs attention between the target word and the dynamically masked and dynamically weighted text features. In this way, deep feature interaction is realized between the target word and the local context features that have been dynamically masked and weighted by semantic cosine distance, and a label smoothing loss function is used to further improve model performance.
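The co-training scheme for the three sub-networks described above relies on giving each network a learning rate suited to its type. The thesis does not specify the rates, so the values and parameter names below are illustrative assumptions; the sketch shows one manual SGD step in which a single joint-loss gradient updates each sub-network at its own rate (a small rate for the pretrained transformer, larger rates for the networks trained from scratch).

```python
# Hypothetical per-sub-network learning rates; names stand in for the
# RoBERTa, GraphSAGE, and DPCNN branches of the joint model.
LEARNING_RATES = {"roberta": 2e-5, "graphsage": 1e-3, "dpcnn": 1e-3}

def sgd_step(params, grads):
    # One gradient-descent step on the joint loss: every sub-network's
    # parameters are updated with that sub-network's own learning rate,
    # so all three branches train simultaneously but at suitable speeds.
    return {
        name: [p - LEARNING_RATES[name] * g for p, g in zip(ps, grads[name])]
        for name, ps in params.items()
    }
```

The same idea is usually expressed in deep-learning frameworks as optimizer parameter groups, one group per sub-network.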
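The semantic cosine distance weighting and masking of the local context can be sketched as follows. The masking threshold and the choice to zero out masked tokens are assumptions for illustration; the thesis only states that similarity to the target word drives both the dynamic weighting and the dynamic masking.

```python
import math

def cosine_sim(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def weight_and_mask(context, target, mask_threshold=0.2):
    # For each token embedding in the local context, compute its semantic
    # cosine similarity to the target-word embedding. Tokens below the
    # (assumed) threshold are masked to zero; the rest are scaled by the
    # similarity, so semantically closer tokens keep stronger features.
    out = []
    for tok in context:
        s = cosine_sim(tok, target)
        if s < mask_threshold:
            out.append([0.0] * len(tok))      # dynamic masking
        else:
            out.append([s * x for x in tok])  # dynamic weighting
    return out
```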
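One plausible reading of the column-embedding-summation attention is sketched below: the target word's token embeddings are collapsed into a single query vector by summing each embedding dimension (column) across tokens, and that query then attends over the weighted and masked context features via standard dot-product attention. This interpretation is an assumption, not a confirmed account of the thesis's exact formula.

```python
import math

def column_sum_query(target_tokens):
    # Collapse the target word's token embeddings into one query vector
    # by summing each embedding dimension (column) across tokens.
    return [sum(col) for col in zip(*target_tokens)]

def attend(query, context):
    # Dot-product attention of the pooled query over the context tokens,
    # returning the attention-weighted sum of context embeddings.
    scores = [sum(q * c for q, c in zip(query, tok)) for tok in context]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    return [sum(w * tok[d] for w, tok in zip(weights, context))
            for d in range(len(context[0]))]
```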
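The label smoothing loss mentioned above can be written out as cross-entropy against a softened target distribution: the true class receives probability 1 - eps plus a uniform share, and every class shares eps uniformly. The smoothing factor eps = 0.1 is a common default, not a value taken from the thesis.

```python
import math

def label_smoothing_ce(logits, target_idx, eps=0.1):
    # Cross-entropy against a smoothed label distribution:
    # q_i = (1 - eps) * 1[i == target] + eps / num_classes.
    k = len(logits)
    m = max(logits)                       # subtract max for stability
    exps = [math.exp(z - m) for z in logits]
    log_z = math.log(sum(exps))
    log_probs = [(z - m) - log_z for z in logits]
    loss = 0.0
    for i, lp in enumerate(log_probs):
        q = (1 - eps) * (1.0 if i == target_idx else 0.0) + eps / k
        loss -= q * lp
    return loss
```

With eps = 0 this reduces to ordinary cross-entropy; with eps > 0 confident predictions are penalized slightly, which discourages overfitting to hard labels.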