Abstract
Discriminative learning methods are widely used in natural language processing. These methods work best when their training and test data are drawn from the same distribution. For many NLP tasks, however, we are confronted with new domains in which labeled data is scarce or non-existent. In such cases, we seek to adapt existing models from a resource-rich source domain to a resource-poor target domain. We introduce structural correspondence learning to automatically induce correspondences among features from different domains. We test our technique on part of speech tagging and show performance gains for varying amounts of source and target training data, as well as improvements in target domain parsing accuracy using our improved tagger.
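The abstract only names structural correspondence learning (SCL), so a minimal sketch of the underlying idea may help: choose pivot features that behave similarly in both domains, train a linear predictor for each pivot from the remaining features on unlabeled data from both domains, and take the SVD of the stacked weight vectors to obtain a low-dimensional projection that is appended to the original features before training a supervised model on the source domain. The sketch below assumes dense NumPy feature matrices and substitutes closed-form ridge regression for the paper's pivot classifiers; the names `scl_projection` and `augment` are illustrative, not from the paper.

```python
# Hedged sketch of structural correspondence learning (SCL).
# Assumptions: a dense unlabeled feature matrix drawn from both domains,
# a list of pivot feature indices, and ridge-regression pivot predictors
# standing in for the paper's linear classifiers.
import numpy as np

def scl_projection(X_unlabeled, pivot_idx, h=25, reg=1.0):
    """Learn an h-dimensional projection from pivot-prediction weights."""
    d = X_unlabeled.shape[1]
    nonpivot_idx = np.setdiff1d(np.arange(d), pivot_idx)
    X_np = X_unlabeled[:, nonpivot_idx]          # non-pivot feature columns

    # One linear predictor per pivot: predict the pivot's value from the
    # non-pivot features (closed-form ridge regression for brevity).
    A = X_np.T @ X_np + reg * np.eye(len(nonpivot_idx))
    B = X_np.T @ X_unlabeled[:, pivot_idx]       # targets: the pivot columns
    W = np.linalg.solve(A, B)                    # (n_nonpivot, n_pivot) weights

    # SVD of the stacked weights; the top-h left singular vectors form the
    # cross-domain projection theta.
    U, _, _ = np.linalg.svd(W, full_matrices=False)
    theta = U[:, :h].T                           # (h, n_nonpivot)
    return theta, nonpivot_idx

def augment(X, theta, nonpivot_idx):
    """Append the induced correspondence features theta @ x to each example."""
    return np.hstack([X, X[:, nonpivot_idx] @ theta.T])
```

A source-domain tagger or classifier would then be trained on `augment(X_source, theta, nonpivot_idx)` and applied to target-domain examples augmented in the same way.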
Publication Info
- Year: 2006
- Type: article
- Pages: 120-120
- Citations: 1550
- Access: Closed
Identifiers
- DOI: 10.3115/1610075.1610094