Abstract
We propose compositional matrix-space models (CMSMs), a novel type of generic compositional model for syntactic and semantic aspects of natural language, based on matrix multiplication. We argue for the structural and cognitive plausibility of this model and show that it can cover and combine various common compositional NLP approaches, ranging from statistical word-space models to symbolic grammar formalisms.
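The compositional operation at the heart of a CMSM is simple to state: each word is assigned a square matrix, and a phrase is represented by the ordered product of its word matrices. The sketch below is a minimal, hypothetical illustration of that idea only; the vocabulary, matrix dimension, and random initialization are assumptions for demonstration, not the paper's trained parameters.

```python
import numpy as np

# Minimal sketch of a compositional matrix-space model (CMSM):
# every word maps to an m x m matrix, and a phrase is represented
# by the ordered product of its word matrices. Vocabulary, the
# dimension m, and the random initialization are illustrative
# assumptions, not values from the paper.

rng = np.random.default_rng(0)
m = 4  # dimensionality of the matrix space (assumed)

# Hypothetical lexicon: one m x m matrix per word.
lexicon = {w: rng.normal(scale=0.5, size=(m, m))
           for w in ["not", "very", "good", "bad"]}

def compose(phrase):
    """Represent a phrase as the product of its word matrices.

    Matrix multiplication is associative but not commutative,
    so word order matters: "not very good" and "very not good"
    receive different representations.
    """
    result = np.eye(m)  # identity matrix = empty phrase
    for word in phrase.split():
        result = result @ lexicon[word]
    return result

print(compose("very good"))
print(compose("not very good"))
```

Non-commutativity is what lets matrix composition encode word order, while associativity means a phrase representation can be built incrementally, token by token.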
Related Publications
Compositional Matrix-Space Models for Sentiment Analysis
We present a general learning-based approach for phrase-level sentiment analysis that adopts an ordinal sentiment scale and is explicitly compositional in nature. Thus, we can m...
Dependency-Based Construction of Semantic Space Models
Traditionally, vector-based semantic space models use word co-occurrence counts from large corpora to represent lexical meaning. In this article we present a novel framework for...
A structured vector space model for word meaning in context
We address the task of computing vector space representations for the meaning of word occurrences, which can vary widely according to context. This task is a crucial step toward...
GloVe: Global Vectors for Word Representation
Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic regularities using vector arithmetic, but the o...
Estimating Linear Models for Compositional Distributional Semantics
In distributional semantics studies, there is growing attention to compositionally determining the distributional meaning of word sequences. Yet, compositional distributional ...
Publication Info
- Year: 2010
- Type: article
- Pages: 907-916
- Citations: 53
- Access: Closed