Article

Inference and learning in probabilistic logic programs using weighted Boolean formulas

Journal

THEORY AND PRACTICE OF LOGIC PROGRAMMING
Volume 15, Pages 358-401

Publisher

Cambridge University Press
DOI: 10.1017/S1471068414000076

Keywords

probabilistic logic programming; probabilistic inference; parameter learning

Funding

  1. Research Foundation-Flanders (FWO-Vlaanderen)
  2. European Commission [FP7-248258-First-MM]
  3. [PF-10/010 NATAR]

Abstract

Probabilistic logic programs are logic programs in which some of the facts are annotated with probabilities. This paper investigates how classical inference and learning tasks known from the graphical model community can be tackled for probabilistic logic programs. Several such tasks, such as computing the marginals given evidence and learning from (partial) interpretations, have not previously been addressed for probabilistic logic programs. The first contribution of this paper is a suite of efficient algorithms for various inference tasks. It is based on converting the program, together with the queries and evidence, into a weighted Boolean formula. This reduces inference tasks to well-studied problems such as weighted model counting, which can be solved using state-of-the-art methods from the graphical model and knowledge compilation literature. The second contribution is an algorithm for parameter estimation in the learning from interpretations setting. The algorithm employs expectation-maximization and is built on top of the developed inference algorithms. The proposed approach is experimentally evaluated. The results show that the inference algorithms improve upon the state of the art in probabilistic logic programming, and that it is indeed possible to learn the parameters of a probabilistic logic program from interpretations.
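
As an illustration of the reduction described in the abstract, the sketch below encodes a toy two-fact program as a weighted Boolean formula (via Clark's completion of its rules) and computes a conditional marginal by weighted model counting. The program, its probabilities, and all function names are illustrative assumptions; the counting here is brute-force enumeration, whereas the paper avoids enumeration by using knowledge compilation.

```python
from itertools import product

# Toy program (illustrative, not from the paper):
#   0.1::burglary.   0.2::earthquake.
#   alarm :- burglary.   alarm :- earthquake.
prob_facts = {"burglary": 0.1, "earthquake": 0.2}

def completion(choice):
    """Clark's completion of the rules: alarm holds iff one of its rule bodies holds."""
    world = dict(choice)
    world["alarm"] = world["burglary"] or world["earthquake"]
    return world

def weight(world):
    """Weight of a total choice: product of p for true facts and 1 - p for false facts."""
    w = 1.0
    for fact, p in prob_facts.items():
        w *= p if world[fact] else 1.0 - p
    return w

def wmc(condition):
    """Brute-force weighted model count over all truth assignments to the probabilistic facts."""
    total = 0.0
    for values in product([True, False], repeat=len(prob_facts)):
        world = completion(dict(zip(prob_facts, values)))
        if condition(world):
            total += weight(world)
    return total

# P(query | evidence) = WMC(formula & query & evidence) / WMC(formula & evidence)
evidence = lambda w: not w["earthquake"]   # evidence: earthquake is false
query = lambda w: w["alarm"]               # query: alarm
print(wmc(lambda w: query(w) and evidence(w)) / wmc(evidence))   # -> 0.1
```

Building on the same sketch, the second contribution can be outlined as an expectation-maximization loop: the E-step uses the WMC-based conditional probabilities above to compute the expected truth value of each probabilistic fact given a partial interpretation, and the M-step re-estimates each fact's probability as the average of these expectations. This is only an assumed, brute-force stand-in for the paper's algorithm, with hypothetical data.

```python
def expected_truth(fact, interpretation):
    """E-step: P(fact is true | partial interpretation), via the WMC reduction above."""
    return wmc(lambda w: w[fact] and interpretation(w)) / wmc(interpretation)

def em(interpretations, iterations=20):
    """EM for the fact probabilities in the learning from (partial) interpretations setting."""
    for _ in range(iterations):
        new_params = {
            fact: sum(expected_truth(fact, obs) for obs in interpretations) / len(interpretations)
            for fact in prob_facts
        }
        prob_facts.update(new_params)  # M-step: replace old parameters with the new estimates
    return dict(prob_facts)

# Hypothetical data: alarm observed true in one interpretation, false in another.
print(em([lambda w: w["alarm"], lambda w: not w["alarm"]]))
```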
