
Research on Syntactic Information Enhanced Neural Conversation Models

Posted on: 2018-02-20  Degree: Master  Type: Thesis
Country: China  Candidate: C W Luo  Full Text: PDF
GTID: 2348330515989687  Subject: Computer software and theory
Abstract/Summary:
Response generation is an important and challenging problem in natural language processing. It is a natural language generation task that requires both natural language understanding and reasoning. In recent years, conversational agents capable of producing human-like responses have become increasingly popular. Given a conversational post, the task is to generate an appropriate response. With the progress of deep learning, many researchers use Sequence-to-Sequence (Seq2Seq) neural networks to model conversations and build neural conversation models. Modeling the syntactic information of a sentence is essential for neural response generation models to produce appropriate responses of high linguistic quality. However, no previous work on conversational response generation with Seq2Seq models has reported taking sentence-level syntactic information into account. In this thesis, we incorporate syntactic information into the Seq2Seq neural conversation model. We first present two part-of-speech enhanced neural conversation models that represent the syntactic information of a sequence by its POS tag sequence and incorporate this information into Seq2Seq neural conversation models. We then develop two dependency-parsing-based neural conversation models that generate a response by first generating the root word of its dependency tree and then constructing the whole sentence around this root word, with an attention mechanism incorporated. Finally, we combine the models above to form our syntactic information enhanced neural conversation models. Experimental results show that, compared with other Seq2Seq neural conversation models, our models generate more grammatically correct and appropriate responses in terms of perplexity, BLEU, and human evaluation.
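As an illustration of the first idea described above, the sketch below shows one plausible way to feed POS information into a Seq2Seq encoder by concatenating word embeddings with POS-tag embeddings. This is a minimal sketch, not the thesis's actual architecture; the class name, vocabulary sizes, and embedding dimensions are hypothetical.

```python
# Hypothetical sketch: a Seq2Seq encoder whose input combines word and
# POS-tag embeddings, one way to inject syntactic information into a
# neural conversation model. All sizes and names are illustrative only.
import torch
import torch.nn as nn

class PosEnhancedEncoder(nn.Module):
    def __init__(self, vocab_size=20000, pos_size=50,
                 word_dim=256, pos_dim=32, hidden_dim=512):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.pos_emb = nn.Embedding(pos_size, pos_dim)
        # A bidirectional GRU reads the concatenated word + POS representation.
        self.rnn = nn.GRU(word_dim + pos_dim, hidden_dim,
                          batch_first=True, bidirectional=True)

    def forward(self, word_ids, pos_ids):
        # word_ids, pos_ids: (batch, seq_len) aligned token and POS indices
        x = torch.cat([self.word_emb(word_ids), self.pos_emb(pos_ids)], dim=-1)
        outputs, hidden = self.rnn(x)  # outputs would feed an attentive decoder
        return outputs, hidden

# Usage: encode a toy post of three tokens with aligned POS tags.
enc = PosEnhancedEncoder()
words = torch.tensor([[12, 7, 301]])
tags = torch.tensor([[3, 1, 4]])
outputs, hidden = enc(words, tags)
print(outputs.shape)  # (1, 3, 1024): bidirectional encoder states
```

In this kind of design, the decoder attends over the POS-aware encoder states, so syntactic cues from the post are available when generating each response token.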
Keywords/Search Tags:Neural Conversation Model, Response Generation, Syntactic Information, Attention Mechanism