In recent years, machine reading comprehension has become a major topic in natural language processing. Machine reading comprehension uses algorithms to make machines understand the semantics of a passage and answer related questions. Reading comprehension is inseparable from question generation and question answering: question generation aims to produce human-like questions from an input text sequence, while question answering aims to answer questions about an input text sequence. With the rapid development of deep learning and the release of large-scale reading comprehension datasets, research on machine reading comprehension has made remarkable progress, but it still faces many challenges. Current question answering models generalize poorly: a model that answers well in some domains performs poorly in others. In addition, the sentence vectors produced by such models are unevenly distributed in the embedding space, which can cause key information to be missed in the final answer. Question generation models, in turn, suffer from weak learning ability, and the questions they generate are often difficult for people to understand. To address these problems, the main contributions of this paper are as follows:

(1) For question answering, targeting the uneven distribution of sentence vectors and the poor generalization of reading comprehension models, this paper proposes training the neural network with contrastive learning and embedding a Simple Memory Module (SSM) into the network. SimCSE (Simple Contrastive Learning of Sentence Embeddings) addresses the uneven distribution of sentence vectors by pulling similar samples closer together and pushing dissimilar samples further apart, so that the model learns better representations of the data. Experimental results show that the model trained with contrastive learning yields a better sentence-vector distribution than the original model. The simple memory module, embedded in the neural network, mimics human memory: when answering a question it first recalls similar past questions, outputs the stored answer if a similar memory exists, and otherwise records the new question and answer. Because the module can store questions from different domains, and similar questions recur across domains, it alleviates the model's poor generalization. Building on the Bidirectional Encoder Representations from Transformers (BERT) model, this paper adds contrastive learning and the simple memory module to form the SimCSE-BERT-SSM model. Experiments show that the model with the simple memory module learns and generalizes better than the other models.

(2) For question generation, which suffers from weak learning ability, this paper uses adversarial learning to combine the question generation model with the question answering model. The UniLM (Unified Language Model Pre-training for Natural Language Understanding and Generation) model is used to build the question generation module, and the question answering module uses the SimCSE-BERT-SSM model proposed above; the generated questions are fed into the question answering module for verification, which improves the learning ability of the generation model. Experimental results show that the proposed model outperforms the original model trained with supervised learning.
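As a concrete reference for the contrastive training mentioned in (1), the sketch below illustrates the in-batch contrastive (InfoNCE) objective used by SimCSE, where two dropout-augmented embeddings of the same sentence form a positive pair and all other sentences in the batch serve as negatives. This is a minimal illustration assuming PyTorch tensors; the function name `simcse_loss` and the tensor shapes are illustrative assumptions, not the thesis's actual implementation.

```python
import torch
import torch.nn.functional as F

def simcse_loss(h1: torch.Tensor, h2: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
    """In-batch contrastive (InfoNCE) loss in the style of SimCSE.

    h1, h2: [batch, dim] embeddings of the same sentences encoded twice,
    so that different dropout masks act as the data augmentation.
    """
    # Cosine similarity between every pair of sentences in the batch.
    h1 = F.normalize(h1, dim=-1)
    h2 = F.normalize(h2, dim=-1)
    sim = h1 @ h2.T / temperature  # [batch, batch] similarity matrix
    # The positive for sentence i is its own second view (the diagonal);
    # every other sentence in the batch is treated as a negative.
    labels = torch.arange(sim.size(0), device=sim.device)
    return F.cross_entropy(sim, labels)
```

Minimizing this loss pulls the two views of the same sentence together and pushes different sentences apart, which is the mechanism the abstract credits for making the sentence-vector distribution more uniform.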