Abstract

Vector-space word representations have been very successful in recent years at improving performance across a variety of NLP tasks. However, most existing work regards words as independent entities, without explicitly modeling relationships among morphologically related words. As a result, rare and complex words are often poorly estimated, and all unknown words are represented in a rather crude way, using only one or a few vectors. This paper addresses this shortcoming by proposing a novel model that builds representations for morphologically complex words from their morphemes. We combine recursive neural networks (RNNs), where each morpheme is a basic unit, with neural language models (NLMs) to consider contextual information when learning morphologically aware word representations. Our learned models outperform existing word representations by a good margin on word similarity tasks across many datasets, including a new dataset we introduce that focuses on rare words to complement existing ones.
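As a rough, illustrative sketch of the morpheme-level recursive composition described above (not the authors' released code), the Python snippet below builds a vector for a morphologically complex word bottom-up from morpheme vectors. The embedding table, dimensionality, parameter values, and attachment order here are hypothetical assumptions; in the paper the composition parameters are trained jointly with a neural language model that scores the composed vectors in context, which is omitted here.

```python
import numpy as np

# Minimal sketch of recursive morpheme composition: the vector for a
# complex word is built bottom-up as p = f(W [x_stem; x_affix] + b).
# All values below are illustrative, not the paper's trained parameters.

d = 50                         # embedding dimension (assumed)
rng = np.random.default_rng(0)

# Hypothetical morpheme embedding table; in the paper these are learned.
morpheme_vecs = {
    "un": rng.normal(scale=0.1, size=d),
    "fortunate": rng.normal(scale=0.1, size=d),
    "ly": rng.normal(scale=0.1, size=d),
}

W = rng.normal(scale=0.1, size=(d, 2 * d))  # composition matrix (learned in the paper)
b = np.zeros(d)                             # composition bias

def compose(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """One recursive step: merge two child vectors into a parent vector."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Build "unfortunately" by repeatedly attaching morphemes to the stem,
# ((un + fortunate) + ly); the attachment order here is an assumption.
v = compose(morpheme_vecs["un"], morpheme_vecs["fortunate"])  # "unfortunate"
v = compose(v, morpheme_vecs["ly"])                           # "unfortunately"
print(v.shape)  # (50,)
```

Because composition reuses the same learned parameters at every step, any word that can be segmented into known morphemes receives a vector, which is what lets the model handle rare and unseen words that whole-word embedding tables cannot.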

Keywords

Morpheme, Word, Computer science, Artificial intelligence, Machine learning, Natural language processing, Similarity, Recursive neural network, Artificial neural network, Linguistics

Publication Info

Year: 2013
Type: Article
Pages: 104-113
Citations: 810
Access: Closed

Citation Metrics

810 citations (source: OpenAlex)

Cite This

Thang Luong, Richard Socher, Christopher D. Manning (2013). Better Word Representations with Recursive Neural Networks for Morphology. Conference on Computational Natural Language Learning, 104-113.