On Prototypical Indifference and Lifted Inference in Relational Probabilistic Conditional Logic

Author

  • Matthias Thimm
Abstract

Semantics for formal models of probabilistic reasoning rely on probability functions that are defined on the interpretations of the underlying classical logic. When this underlying logic is relational in nature, i.e. a fragment of first-order logic, the space needed to represent these probability functions explicitly is exponential in both the number of predicates and the number of domain elements. Consequently, probabilistic reasoning becomes a demanding task. Here, we investigate lifted inference in the context of explicit model representation with respect to an inference operator that satisfies prototypical indifference, i.e. an inference operator that is indifferent between individuals for which the same information is represented. As reasoning based on the principle of maximum entropy satisfies this property, we exemplify our ideas by compactly characterizing the maximum entropy model of a probabilistic knowledge base in a relational probabilistic conditional logic. Our results show that lifted inference is no longer exponential in the number of domain elements when the language is restricted to unary predicates, but remains infeasible in the general case.
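As a rough illustration of the blow-up described above (the notation below is ours, not taken from the paper): with k unary predicates and a domain of n elements there are k·n ground atoms and hence 2^(k·n) Herbrand interpretations, and an explicitly stored probability function keeps one entry per interpretation. The maximum entropy model is then the entropy-maximizing probability function among those satisfying the knowledge base KB:

    |\Omega| = 2^{k \cdot n}, \qquad
    P^{\mathrm{ME}} = \operatorname*{arg\,max}_{P \models \mathcal{KB}}
      \Bigl( -\sum_{\omega \in \Omega} P(\omega)\,\log P(\omega) \Bigr)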


Related articles

Evolving Knowledge in Theory and Applications

Semantics for formal models of probabilistic reasoning rely on probability functions that are defined on the interpretations of the underlying classical logic. When this underlying logic is relational in nature, i.e. a fragment of first-order logic, the space needed to represent these probability functions explicitly is exponential in both the number of predicates and the number of do...


Lifted Inference and Learning in Statistical Relational Models

Statistical relational models combine aspects of first-order logic and probabilistic graphical models, enabling them to model complex logical and probabilistic interactions between large numbers of objects. This level of expressivity comes at the cost of increased complexity of inference, motivating a new line of research in lifted probabilistic inference. By exploiting symmetries of the relation...


Lifted Discriminative Learning of Probabilistic Logic Programs

Probabilistic logic programming (PLP) provides a powerful tool for reasoning with uncertain relational models. However, learning probabilistic logic programs is expensive due to the high cost of inference. Among the proposals to overcome this problem, one of the most promising is lifted inference. In this paper we consider PLP models that are amenable to lifted inference and present an algorith...


Understanding the Complexity of Lifted Inference and Asymmetric Weighted Model Counting

In this paper we study lifted inference for the Weighted First-Order Model Counting problem (WFOMC), which counts the assignments that satisfy a given sentence in first-order logic (FOL); it has applications in Statistical Relational Learning (SRL) and Probabilistic Databases (PDB). We present several results. First, we describe a lifted inference algorithm that generalizes prior approaches in S...
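For intuition only, here is a minimal sketch of propositional weighted model counting by exhaustive enumeration, i.e. the ground-level problem that WFOMC lifts to first-order sentences; the atoms, weights, and helper names are illustrative and not taken from the cited paper:

    from itertools import product

    def wmc(atoms, weight, sentence):
        """Naive weighted model counting: sum the weight of every truth
        assignment over `atoms` that satisfies `sentence` (O(2^len(atoms)))."""
        total = 0.0
        for values in product([False, True], repeat=len(atoms)):
            world = dict(zip(atoms, values))
            if sentence(world):
                w = 1.0
                for atom, val in world.items():
                    w *= weight[atom] if val else 1.0 - weight[atom]
                total += w
        return total

    # Ground sentence smokes(a) -> cancer(a) over a one-element domain {a}.
    atoms = ["smokes(a)", "cancer(a)"]
    weight = {"smokes(a)": 0.3, "cancer(a)": 0.1}   # weight of each atom being true
    sentence = lambda w: (not w["smokes(a)"]) or w["cancer(a)"]
    print(wmc(atoms, weight, sentence))             # 0.73

The loop over all 2^|atoms| truth assignments is exactly the cost that lifted algorithms try to avoid by exploiting symmetries among interchangeable domain elements.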


Lifted Variable Elimination for Probabilistic Logic Programming

Lifted inference has been proposed for various probabilistic logical frameworks in order to compute the probability of queries in a time that depends on the size of the domains of the random variables rather than the number of instances. Even if various authors have underlined its importance for probabilistic logic programming (PLP), lifted inference has been applied up to now only to relationa...



Journal:

Volume   Issue

Pages  -

Publication year: 2012