Abstract

Classic methods for clinical temporal relation extraction focus on relation candidates within a single sentence. In contrast, breakthrough Bidirectional Encoder Representations from Transformers (BERT) models are trained on large quantities of arbitrary contiguous text spans rather than on individual sentences. In this study, we aim to build a sentence-agnostic framework for the task of CONTAINS temporal relation extraction. We establish a new state-of-the-art result for the task, 0.684 F for in-domain (0.055-point improvement) and 0.565 F for cross-domain (0.018-point improvement), by fine-tuning BERT and pre-training domain-specific BERT models on sentence-agnostic temporal relation instances with WordPiece-compatible encodings, and by augmenting the labeled data with automatically generated "silver" instances.
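As a concrete illustration of the fine-tuning step the abstract describes, the sketch below (not the authors' code) builds one sentence-agnostic CONTAINS instance and runs a single BERT fine-tuning step. The marker tokens <e>/<t>, the label set, the example text, and the bert-base-uncased checkpoint are assumptions for illustration only; the paper pre-trains its own domain-specific BERT models with WordPiece-compatible encodings.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Assumed label set: CONTAINS, its inverse, or no relation.
LABELS = ["NONE", "CONTAINS", "CONTAINS-1"]

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# Hypothetical marker tokens flag the candidate event/time pair;
# registering them keeps WordPiece from splitting them apart.
tokenizer.add_tokens(["<e>", "</e>", "<t>", "</t>"])

model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS))
model.resize_token_embeddings(len(tokenizer))  # room for the new tokens

# One sentence-agnostic instance: the candidate pair may cross a
# sentence boundary, so the input is an arbitrary contiguous span.
text = ("She was admitted <t> on March 11 </t> . "
        "A chest x-ray showed <e> pneumonia </e> .")
label = torch.tensor([LABELS.index("CONTAINS")])

enc = tokenizer(text, truncation=True, max_length=128, return_tensors="pt")
loss = model(**enc, labels=label).loss
loss.backward()  # an optimizer step would complete one fine-tuning update

Scaling this sketch toward the paper's setting would mean iterating such updates over the gold training data augmented with the automatically generated silver instances.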

Keywords

Computer science, Sentence, Focus (optics), Artificial intelligence, Encoder, Transformer, Relationship extraction, Task (project management), Natural language processing, Relation (database), Domain (mathematical analysis), Point (geometry), Information extraction, Data mining, Mathematics

Publication Info

Year: 2019
Type: article
Citations: 77 (OpenAlex)
Access: Closed

Cite This

Chen Lin, Timothy A. Miller, Dmitriy Dligach et al. (2019). A BERT-based Universal Model for Both Within- and Cross-sentence Clinical Temporal Relation Extraction. Proceedings of the 2nd Clinical Natural Language Processing Workshop. https://doi.org/10.18653/v1/w19-1908

Identifiers

DOI
10.18653/v1/w19-1908