Ensemble-Based Fact Classification with Knowledge Graph Embeddings

Authors

Abstract

Numerous prior works have shown how Knowledge Graph Embeddings (KGEs) can be used to rank unseen facts that are likely to be true. Much less attention has been given to using KGEs for fact classification, i.e., marking unseen facts either as true or false. In this paper, we tackle this problem with a new technique that exploits ensemble learning and weak supervision, following the principle that multiple weak classifiers can make a strong one. Our method is implemented in a system called $$\mathsf{DuEL}$$. $$\mathsf{DuEL}$$ post-processes the ranked lists produced by embedding models with multiple classifiers, which include supervised models like LSTMs, MLPs, and CNNs, and unsupervised ones that consider subgraphs and reachability in the graph. The output of these classifiers is aggregated using a weakly supervised procedure that does not need ground truths, which would be expensive to obtain. Our experiments show that $$\mathsf{DuEL}$$ produces a more accurate classification than other existing methods, with improvements of up to 72% in terms of $$F_1$$ score. This suggests that ensemble learning and weak supervision are promising techniques for performing fact classification with KGEs.
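
To make the aggregation idea concrete, below is a minimal Python sketch (not the authors' implementation) of how votes from several fact classifiers could be combined without ground-truth labels. Plain weighted voting stands in for the weak-supervision model used by $$\mathsf{DuEL}$$, and all classifier names, weights, and thresholds are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Fact = Tuple[str, str, str]  # (head entity, relation, tail entity)


@dataclass
class Vote:
    label: int     # +1 = true, -1 = false, 0 = abstain
    weight: float  # confidence attached to the classifier that voted


def aggregate(fact: Fact, classifiers: List[Callable[[Fact], Vote]]) -> bool:
    """Combine the (possibly abstaining) votes of several classifiers."""
    score = 0.0
    for classify in classifiers:
        vote = classify(fact)
        score += vote.weight * vote.label
    # Mark the candidate fact as true if the weighted vote is positive.
    return score > 0.0


# Illustrative voters; in DuEL these would be, e.g., supervised LSTM/MLP/CNN
# classifiers over the embedding-model ranking and unsupervised signals
# derived from subgraphs and reachability in the graph.
def rank_vote(fact: Fact) -> Vote:
    # Hypothetical rule: trust facts the embedding model ranks near the top.
    rank = hash(fact) % 100  # placeholder for the real KGE rank
    return Vote(label=1 if rank < 10 else -1, weight=0.6)


def reachability_vote(fact: Fact) -> Vote:
    # Hypothetical rule: abstain when no path-based evidence is available.
    return Vote(label=0, weight=0.4)


if __name__ == "__main__":
    candidate = ("Berlin", "capital_of", "Germany")
    print(aggregate(candidate, [rank_vote, reachability_vote]))
```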


Related articles

Convolutional 2D Knowledge Graph Embeddings

Link prediction for knowledge graphs is the task of predicting missing relationships between entities. Previous work on link prediction has focused on shallow, fast models which can scale to large knowledge graphs. However, these models learn less expressive features than deep, multi-layer models – which potentially limits performance. In this work we introduce ConvE, a multi-layer convolutiona...


Proximity-based Graph Embeddings for Multi-label Classification

In many real applications of text mining, information retrieval and natural language processing, large-scale features are frequently used, which often make the employed machine learning algorithms intractable, leading to the well-known problem “curse of dimensionality”. Aiming at not only removing the redundant information from the original features but also improving their discriminating abili...


Towards Lexical Chains for Knowledge-Graph-based Word Embeddings

Word vectors with varying dimensionalities and produced by different algorithms have been extensively used in NLP. The corpora that the algorithms are trained on can contain either natural language text (e.g. Wikipedia or newswire articles) or artificially-generated pseudo corpora due to natural data sparseness. We exploit Lexical Chain based templates over Knowledge Graph for generating pseudo...


Inducing Interpretability in Knowledge Graph Embeddings

We study the problem of inducing interpretability in KG embeddings. Specifically, we explore the Universal Schema (Riedel et al., 2013) and propose a method to induce interpretability. There have been many vector space models proposed for the problem; however, most of these methods don’t address the interpretability (semantics) of individual dimensions. In this work, we study this problem and p...


Fast Linear Model for Knowledge Graph Embeddings

This paper shows that a simple baseline based on a Bag-of-Words (BoW) representation learns surprisingly good knowledge graph embeddings. By casting knowledge base completion and question answering as supervised classification problems, we observe that modeling co-occurrences of entities and relations leads to state-of-the-art performance with a training time of a few minutes using the open sour...



Journal

Journal title: Lecture Notes in Computer Science

Year: 2022

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-031-06981-9_9