Knowledge Graph Embeddings in Geometric Algebras

Authors: C. Xu, M. Nayyeri, Y. Chen, J. Lehmann
Published in: Proceedings of the 28th International Conference on Computational Linguistics
Year: 2020

Citation information

C. Xu, M. Nayyeri, Y. Chen, J. Lehmann,
Proceedings of the 28th International Conference on Computational Linguistics,
2020,
pp. 530–544,
International Committee on Computational Linguistics,
http://dx.doi.org/10.18653/v1/2020.coling-main.46

Knowledge graph (KG) embedding aims at embedding entities and relations in a KG into a low-dimensional latent representation space. Existing KG embedding approaches model entities and relations in a KG by utilizing real-valued, complex-valued, or hypercomplex-valued (Quaternion or Octonion) representations, all of which are subsumed into a geometric algebra. In this work, we introduce a novel geometric algebra-based KG embedding framework, GeomE, which utilizes multivector representations and the geometric product to model entities and relations. Our framework subsumes several state-of-the-art KG embedding approaches and is advantageous with its ability of modeling various key relation patterns, including (anti-)symmetry, inversion and composition, rich expressiveness with a higher degree of freedom, as well as good generalization capacity. Experimental results on multiple benchmark knowledge graphs show that the proposed approach outperforms existing state-of-the-art models for link prediction.
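As a concrete illustration of the core operation the abstract refers to, the sketch below implements the geometric product of multivectors in the 2D geometric algebra Cl(2,0), where a multivector has a scalar, two vector (e1, e2), and one bivector (e12) coefficient. The `score` function is a hypothetical example of how a (head, relation, tail) triple might be scored from such products; it is not the paper's exact scoring function, only an analogy to ComplEx-style scalar-part scoring under that assumption.

```python
def geometric_product(a, b):
    """Geometric product of two multivectors in Cl(2,0).

    Multivectors are coefficient lists [scalar, e1, e2, e12],
    using the basis rules e1*e1 = e2*e2 = 1 and e1*e2 = -e2*e1 = e12
    (hence e12*e12 = -1).
    """
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return [
        a0*b0 + a1*b1 + a2*b2 - a3*b3,  # scalar (grade-0) part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,  # e1 part
        a0*b2 + a2*b0 + a1*b3 - a3*b1,  # e2 part
        a0*b3 + a3*b0 + a1*b2 - a2*b1,  # e12 (bivector) part
    ]

def score(head, rel, tail):
    # Hypothetical triple score: the scalar (grade-0) part of the
    # product head * rel * tail. Illustrative only; the paper's
    # actual GeomE scoring function may differ.
    return geometric_product(geometric_product(head, rel), tail)[0]
```

For example, `geometric_product([0, 1, 0, 0], [0, 0, 1, 0])` returns `[0, 0, 0, 1]`, i.e. e1·e2 = e12, and swapping the arguments flips the sign, which is the antisymmetry that makes the geometric product useful for modeling non-symmetric relations.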