Publication:
TuckER: Tensor Factorization for Knowledge Graph Completion
Ivana Balazevic, Carl Allen, Timothy M. Hospedales • Conference on Empirical Methods in Natural Language Processing • 28 January 2019
TLDR: This work proposes TuckER, a relatively straightforward but powerful linear model based on Tucker decomposition of the binary tensor representation of knowledge graph triples that outperforms previous state-of-the-art models across standard link prediction datasets, acting as a strong baseline for more elaborate models.
Citations: 589
Abstract: Knowledge graphs are structured representations of real world facts. However, they typically contain only a small subset of all possible facts. Link prediction is the task of inferring missing facts based on existing ones. We propose TuckER, a relatively straightforward but powerful linear model based on Tucker decomposition of the binary tensor representation of knowledge graph triples. TuckER outperforms previous state-of-the-art models across standard link prediction datasets, acting as a strong baseline for more elaborate models. We show that TuckER is a fully expressive model, derive sufficient bounds on its embedding dimensionalities and demonstrate that several previously introduced linear models can be viewed as special cases of TuckER.
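Illustrative sketch: TuckER scores a triple by contracting a learned core tensor W with the subject, relation and object embeddings, φ(e_s, r, e_o) = W ×₁ e_s ×₂ w_r ×₃ e_o. The Python/NumPy snippet below is a minimal, hypothetical illustration of that contraction only; the embedding dimensionalities, entity/relation counts and random parameters are placeholder assumptions, not the authors' implementation or training setup.

    import numpy as np

    # Placeholder sizes (assumptions for illustration, not the paper's settings).
    n_entities, n_relations = 10_000, 200
    d_e, d_r = 200, 30

    rng = np.random.default_rng(0)
    W = rng.standard_normal((d_e, d_r, d_e))    # core tensor of the Tucker decomposition
    E = rng.standard_normal((n_entities, d_e))  # entity embedding matrix
    R = rng.standard_normal((n_relations, d_r)) # relation embedding matrix

    def tucker_score(s, r, o):
        """Contract W with e_s, w_r, e_o along its three modes; a higher score means a more plausible triple."""
        return float(np.einsum('ijk,i,j,k->', W, E[s], R[r], E[o]))

    print(tucker_score(0, 1, 2))  # score of the triple (subject 0, relation 1, object 2)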