Understanding Negative Sampling in Knowledge Graph Embedding

  Authors

Jing Qian1,2, Gangmin Li1, Katie Atkinson2 and Yong Yue1 (1Xi'an Jiaotong-Liverpool University, China; 2University of Liverpool, United Kingdom)

  Abstract

Knowledge graph embedding (KGE) projects the entities and relations of a knowledge graph (KG) into a low-dimensional vector space, and it has made steady progress in recent years. Conventional KGE methods, especially translational distance-based models, are trained by discriminating positive samples from negative ones. Because most KGs store only positive samples for space efficiency, negative sampling plays a crucial role in encoding the triples of a KG. The quality of the generated negative samples directly affects the performance of the learnt knowledge representations in a myriad of downstream tasks, such as recommendation, link prediction and node classification. We summarize current negative sampling approaches in KGE into three categories: static distribution-based, dynamic distribution-based and custom cluster-based. Based on this categorization, we discuss the most prevalent existing approaches and their characteristics. We hope this review can provide some guidelines for new thinking about negative sampling in KGE.
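To make the role of negative sampling concrete, the sketch below illustrates the simplest static distribution-based scheme: corrupting the head or tail of a positive triple with an entity drawn uniformly at random, while filtering out corruptions that already exist in the KG (false negatives). This is a minimal illustration, not the paper's method; the function and data names are hypothetical.

    import random

    def uniform_negative_sample(triple, entities, known_triples):
        """Corrupt the head or tail of a (head, relation, tail) triple
        with an entity drawn uniformly at random, skipping corruptions
        that already appear in the KG (false negatives)."""
        head, relation, tail = triple
        while True:
            candidate = random.choice(entities)
            # Corrupt head or tail with equal probability.
            if random.random() < 0.5:
                corrupted = (candidate, relation, tail)
            else:
                corrupted = (head, relation, candidate)
            if corrupted not in known_triples:
                return corrupted

    # Example usage with a toy KG.
    kg = {("Paris", "capital_of", "France"), ("Lyon", "located_in", "France")}
    entities = ["Paris", "Lyon", "France", "Berlin", "Germany"]
    print(uniform_negative_sample(("Paris", "capital_of", "France"), entities, kg))

Dynamic distribution-based methods (e.g. GAN-based generators) and custom cluster-based methods differ mainly in how the replacement entity is chosen, aiming for harder or more semantically plausible negatives than uniform draws.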

  Keywords

Negative Sampling, Knowledge Graph Embedding, Generative Adversarial Network.