Context-Aware Representations for Knowledge Base Relation Extraction

EMNLP 2017 · Daniil Sorokin, Iryna Gurevych

We demonstrate that for sentence-level relation extraction it is beneficial to consider other relations in the sentential context while predicting the target relation. Our architecture uses an LSTM-based encoder to jointly learn representations for all relations in a single sentence. We combine the context representations with an attention mechanism to make the final prediction. We use the Wikidata knowledge base to construct a dataset of multiple relations per sentence and to evaluate our approach. Compared to a baseline system, our method results in an average error reduction of 24% on a held-out set of relations. The code and the dataset to replicate the experiments are available at https://github.com/ukplab/.
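The abstract describes the architecture only at a high level. Below is a minimal sketch of the idea in (assumed) PyTorch, not the authors' released implementation (that is linked above): a shared LSTM encodes the target relation candidate and every other relation candidate in the sentence, and an attention mechanism over the context encodings informs the final prediction. The class name, the bilinear attention scorer, and all dimensions are illustrative assumptions.

```python
# Hedged sketch of context-aware relation classification: encode every
# relation candidate in a sentence with one shared LSTM, then score the
# target relation using an attention-weighted summary of the other
# (context) relation representations. Not the authors' code; all names
# and hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class ContextAttRelationClassifier(nn.Module):
    def __init__(self, vocab_size, num_relations, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Shared encoder: the same LSTM produces a vector for each candidate.
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        # Bilinear attention between target and each context encoding
        # (one plausible choice of scoring function, assumed here).
        self.attn = nn.Bilinear(hidden_dim, hidden_dim, 1)
        self.out = nn.Linear(2 * hidden_dim, num_relations)

    def encode(self, token_ids):
        # token_ids: (batch, seq_len) word indices for the sentence with
        # the candidate entity pair marked in some way (details omitted).
        _, (h, _) = self.encoder(self.embed(token_ids))
        return h[-1]  # final hidden state, (batch, hidden_dim)

    def forward(self, target_ids, context_ids):
        # target_ids: (batch, seq_len); context_ids: (batch, n_ctx, seq_len)
        target = self.encode(target_ids)                        # (batch, h)
        b, n_ctx, seq_len = context_ids.shape
        ctx = self.encode(context_ids.view(b * n_ctx, seq_len))
        ctx = ctx.view(b, n_ctx, -1)                            # (batch, n_ctx, h)
        # Attention weights over context relations, conditioned on the target.
        scores = self.attn(target.unsqueeze(1).expand_as(ctx).contiguous(), ctx)
        weights = torch.softmax(scores.squeeze(-1), dim=1)      # (batch, n_ctx)
        ctx_summary = (weights.unsqueeze(-1) * ctx).sum(dim=1)  # (batch, h)
        return self.out(torch.cat([target, ctx_summary], dim=-1))
```

At prediction time, each additional entity pair detected in the sentence would supply one context sequence; batching, padding, and masking of variable numbers of context relations are omitted here for brevity.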

Datasets

Wikipedia-Wikidata relations (constructed in this paper from the Wikidata knowledge base)

Results

Task                | Dataset                      | Model      | Metric     | Value  | Global Rank
--------------------|------------------------------|------------|------------|--------|------------
Relation Extraction | Wikipedia-Wikidata relations | ContextAtt | Error rate | 0.1590 | #1

Methods


LSTM, Attention