HittER: Hierarchical Transformers for Knowledge Graph Embeddings
- Sanxing Chen
- Xiaodong Liu
- Jianfeng Gao
- Jian Jiao
- Ruofei Zhang
- Yangfeng Ji
This paper examines the challenging problem of learning representations of entities and relations in a complex multi-relational knowledge graph. We propose HittER, a Hierarchical Transformer model that jointly learns Entity-relation composition and Relational contextualization based on a source entity’s neighborhood. Our proposed model consists of two different Transformer blocks: the bottom block extracts features of each entity-relation pair in the local neighborhood of the source entity, and the top block aggregates the relational information from the outputs of the bottom block. We further design a masked entity prediction task to balance information from the relational context and the source entity itself. Evaluated on the task of link prediction, our approach achieves new state-of-the-art results on two standard benchmark datasets, FB15K-237 and WN18RR.
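To make the two-block hierarchy concrete, here is a minimal PyTorch sketch of the idea as described in the abstract. The class name, the [CLS]/[MASK]-style summary tokens, the layer sizes, and the input conventions are all illustrative assumptions, not the authors' released implementation (linked below).

```python
import torch
import torch.nn as nn

class HittERSketch(nn.Module):
    """Minimal sketch of a hierarchical Transformer for KG embeddings.

    Hypothetical re-implementation based only on the abstract: a bottom
    Transformer encodes each (entity, relation) pair in the source
    entity's neighborhood, and a top Transformer aggregates the pair
    features together with the source entity itself. All details here
    are assumptions, not the paper's exact configuration.
    """

    def __init__(self, num_entities, num_relations, dim=128, heads=4, layers=2):
        super().__init__()
        self.ent_emb = nn.Embedding(num_entities, dim)
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.pair_cls = nn.Parameter(torch.randn(1, 1, dim))  # bottom summary token
        self.ctx_cls = nn.Parameter(torch.randn(1, 1, dim))   # top summary token
        self.mask_tok = nn.Parameter(torch.randn(1, dim))     # [MASK] for the source entity
        make_block = lambda: nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True),
            num_layers=layers,
        )
        self.bottom = make_block()  # features of one (entity, relation) pair
        self.top = make_block()     # aggregates pair features + source entity
        self.score = nn.Linear(dim, num_entities)  # logits over candidate entities

    def forward(self, src_ents, neighbor_ents, neighbor_rels, mask_src=False):
        # src_ents: (batch,); neighbor_ents, neighbor_rels: (batch, num_neighbors)
        b, n = neighbor_ents.shape
        e = self.ent_emb(neighbor_ents).reshape(b * n, 1, -1)
        r = self.rel_emb(neighbor_rels).reshape(b * n, 1, -1)
        # Bottom block: [CLS; entity; relation] -> one feature vector per pair.
        pair_in = torch.cat([self.pair_cls.expand(b * n, -1, -1), e, r], dim=1)
        pair_feat = self.bottom(pair_in)[:, 0].reshape(b, n, -1)
        # Source entity token, optionally masked (masked entity prediction).
        src = self.mask_tok.expand(b, -1) if mask_src else self.ent_emb(src_ents)
        # Top block: [CLS; source; pair features] -> contextualized representation.
        ctx_in = torch.cat([self.ctx_cls.expand(b, -1, -1),
                            src.unsqueeze(1), pair_feat], dim=1)
        return self.score(self.top(ctx_in)[:, 0])
```

In this reading, the bottom block sees each pair in isolation, so the neighborhood size only affects the top block's sequence length; that is the sense in which the two levels divide entity-relation composition from relational aggregation.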
Publication Downloads
HittER: Hierarchical Transformers for Knowledge Graph Embeddings [Code]
November 30, 2021
HittER generates embeddings for large-scale knowledge graphs and performs link prediction using a hierarchical Transformer model. It appeared at EMNLP 2021.
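To illustrate the masked entity prediction objective mentioned in the abstract, below is a hypothetical training step built on the sketch above: with some probability the source entity is hidden and becomes the prediction target, otherwise the model predicts a linked tail entity. The masking rate, toy graph sizes, and cross-entropy loss are all assumptions.

```python
import torch
import torch.nn.functional as F

# Toy sizes for illustration; real KGs like FB15K-237 are far larger.
model = HittERSketch(num_entities=1000, num_relations=50)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)

src = torch.randint(0, 1000, (8,))        # source entities
nbr_e = torch.randint(0, 1000, (8, 5))    # 5 neighbor entities per source
nbr_r = torch.randint(0, 50, (8, 5))      # relations to those neighbors
tail = torch.randint(0, 1000, (8,))       # link-prediction targets

mask_src = torch.rand(1).item() < 0.5     # assumed 50% masking rate
logits = model(src, nbr_e, nbr_r, mask_src=mask_src)
# When the source is masked, the model must recover it from the
# relational context alone; otherwise it predicts the tail entity.
target = src if mask_src else tail
loss = F.cross_entropy(logits, target)
loss.backward()
opt.step()
```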