Abstract

Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic regularities using vector arithmetic, but the origin of these regularities has remained opaque. We analyze and make explicit the model properties needed for such regularities to emerge in word vectors. The result is a new global log-bilinear regression model that combines the advantages of the two major model families in the literature: global matrix factorization and local context window methods. Our model efficiently leverages statistical information by training only on the nonzero elements in a word-word cooccurrence matrix, rather than on the entire sparse matrix or on individual context windows in a large corpus. The model produces a vector space with meaningful substructure, as evidenced by its performance of 75% on a recent word analogy task. It also outperforms related models on similarity tasks and named entity recognition.
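The abstract's key idea is a weighted least-squares regression fit only to the nonzero entries of the co-occurrence matrix. Below is a minimal sketch of that objective, J = Σ_ij f(X_ij)(w_iᵀw̃_j + b_i + b̃_j − log X_ij)², using NumPy; the weighting function f with x_max = 100 and α = 3/4 follows the paper, but the toy co-occurrence data, array names, and dimensions here are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the GloVe weighted least-squares objective on nonzero co-occurrence
# entries only. Toy data and sizes are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
V, dim = 1000, 50                         # vocabulary size, embedding dimension

# Nonzero co-occurrence counts X_ij, stored as (row, col, count) triples.
nnz = 5000
rows = rng.integers(0, V, nnz)
cols = rng.integers(0, V, nnz)
counts = rng.exponential(5.0, nnz) + 1.0  # strictly positive counts

# Parameters: word vectors, context vectors, and the two bias terms.
W = 0.1 * rng.standard_normal((V, dim))
W_tilde = 0.1 * rng.standard_normal((V, dim))
b = np.zeros(V)
b_tilde = np.zeros(V)

def weight(x, x_max=100.0, alpha=0.75):
    """f(X_ij) from the paper: down-weights rare pairs, caps frequent ones."""
    return np.where(x < x_max, (x / x_max) ** alpha, 1.0)

def glove_loss(rows, cols, counts):
    """J = sum over nonzero ij of f(X_ij) * (w_i . w~_j + b_i + b~_j - log X_ij)^2."""
    dots = np.einsum("nd,nd->n", W[rows], W_tilde[cols])
    err = dots + b[rows] + b_tilde[cols] - np.log(counts)
    return np.sum(weight(counts) * err ** 2)

print("initial loss:", glove_loss(rows, cols, counts))
```

Because the sum runs only over stored nonzero triples, the cost of one pass scales with the number of nonzero co-occurrences rather than with corpus size or with the full V × V matrix, which is the efficiency argument made in the abstract.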

Keywords

Computer science, Representation, Word, Natural language processing, Artificial intelligence, Human–computer interaction, Computer graphics, Linguistics

Related Publications

Finding Structure in Time

Time underlies many interesting human behaviors. Thus, the question of how to represent time in connectionist models is very important. One approach is to represent time implici...

1990 · Cognitive Science · 10,427 citations

Publication Info

Year: 2014
Type: article
Pages: 1532–1543
Citations: 32,840
Access: Closed

Citation Metrics

32,840 citations (OpenAlex)

Cite This

Jeffrey Pennington, Richard Socher, Christopher D. Manning (2014). GloVe: Global Vectors for Word Representation. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), 1532–1543. https://doi.org/10.3115/v1/d14-1162

Identifiers

DOI: 10.3115/v1/d14-1162