
A Study On Semantic Relation Representations In Neural Word Embedding

Posted on: 2018-02-21
Degree: M.S.
Type: Thesis
University: The Florida State University
Candidate: Chen, Zhiwei
GTID: 2478390020457680
Subject: Computer Science

Abstract/Summary:
Neural-network-based word embeddings have demonstrated outstanding results on a variety of tasks and have become a standard input for deep learning methods in Natural Language Processing (NLP). Although these representations are able to capture semantic regularities in language, some general questions, e.g., "what kinds of semantic relations do the embeddings represent?" and "how can the semantic relations be retrieved from an embedding?", remain unclear, and very little relevant work has been done. In this study, we propose a new approach to exploring the semantic relations represented in neural embeddings, based on WordNet and the Unified Medical Language System (UMLS). Our study demonstrates that neural embeddings do prefer some semantic relations, while also representing diverse semantic relations. Our study also finds that Named Entity Recognition (NER)-based phrase composition outperforms Word2phrase, and that word variants do not affect performance on analogy and semantic relation tasks.
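The analogy tasks mentioned above are commonly evaluated with the vector-offset (3CosAdd) method: a relation such as man : woman is encoded as a difference of embedding vectors, and the answer to an analogy is the word whose vector is closest to that offset applied to a new word. A minimal sketch, using tiny hand-made toy vectors (not trained embeddings, and not the thesis's actual data or code):

```python
import numpy as np

# Toy embedding table purely for illustration; a real study would load
# trained vectors (e.g., Word2vec or GloVe). These values are assumptions.
emb = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.8, 0.6, 0.9]),
    "man":   np.array([0.2, 0.9, 0.1]),
    "woman": np.array([0.2, 0.9, 0.9]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c, emb):
    """Solve a : b :: c : ? by the vector-offset (3CosAdd) method:
    return the word (excluding the inputs) closest to emb[b] - emb[a] + emb[c]."""
    target = emb[b] - emb[a] + emb[c]
    candidates = [w for w in emb if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(emb[w], target))

print(analogy("man", "woman", "king", emb))  # prints: queen
```

Which relations this retrieval recovers reliably, beyond the well-known gender offset, is exactly the kind of question the thesis probes with WordNet and UMLS relation inventories.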
Keywords/Search Tags: Semantic, Word, Neural, Embeddings