Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction
Papaluca Andrea; Lenskiy Artem; Krefl Daniel; Suominen Hanna
The permanent address of the publication is:
https://urn.fi/URN:NBN:fi-fe2022102463141
Abstract
In this work, we propose combining pre-trained knowledge base graph embeddings with transformer-based language models to improve performance on the sentential Relation Extraction task in natural language processing. Our proposed model is a simple variation of existing models that incorporates off-task pre-trained graph embeddings alongside an on-task fine-tuned BERT encoder. We perform a detailed statistical evaluation of the model on standard datasets. We provide evidence that the added graph embeddings improve performance, making this simple approach competitive with state-of-the-art models that perform explicit on-task training of the graph embeddings. Furthermore, we observe for the underlying BERT model an interesting power-law scaling behavior between the variance of the F1 score obtained for a relation class and its support in terms of training examples.
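The abstract does not include code, so the following is only a minimal sketch of one plausible reading of the described architecture: a BERT encoder fine-tuned on-task, concatenated with frozen, off-task pre-trained knowledge base graph embeddings (e.g., TransE- or ComplEx-style entity vectors) of the two entities, followed by a linear relation classifier. The class name GraphAugmentedRE, the [CLS] pooling, and the simple concatenation are assumptions for illustration, not the paper's exact design.

```python
# Hypothetical sketch: combining a fine-tuned BERT encoder with frozen,
# off-task pre-trained KB graph embeddings for sentential relation extraction.
import torch
import torch.nn as nn
from transformers import BertModel


class GraphAugmentedRE(nn.Module):  # name invented for this sketch
    def __init__(self, num_relations: int, kb_vectors: torch.Tensor):
        super().__init__()
        # BERT is fine-tuned on-task together with the classifier.
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # The KB embeddings are pre-trained off-task and kept frozen.
        self.graph_emb = nn.Embedding.from_pretrained(kb_vectors, freeze=True)
        hidden = self.bert.config.hidden_size + 2 * kb_vectors.size(1)
        self.classifier = nn.Linear(hidden, num_relations)

    def forward(self, input_ids, attention_mask, head_id, tail_id):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]   # [CLS] sentence representation
        h = self.graph_emb(head_id)         # head-entity KB vector
        t = self.graph_emb(tail_id)         # tail-entity KB vector
        # Concatenate textual and graph features, then classify the relation.
        return self.classifier(torch.cat([cls, h, t], dim=-1))


# Example instantiation with stand-in data: 1000 KB entities with 200-dim
# pre-trained vectors (random here for illustration) and 40 relation classes.
kb_vectors = torch.randn(1000, 200)
model = GraphAugmentedRE(num_relations=40, kb_vectors=kb_vectors)
```

Freezing the graph embeddings is the point of contrast with the state-of-the-art models mentioned above, which instead train the graph embeddings on-task.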