Article

Recursive conditioning

Journal

ARTIFICIAL INTELLIGENCE
Volume 126, Issue 1-2, Pages 5-41

Publisher

ELSEVIER
DOI: 10.1016/S0004-3702(00)00069-2

Keywords

Bayesian networks; probabilistic inference; time-space tradeoff; conditioning methods; decompositional reasoning


We introduce an any-space algorithm for exact inference in Bayesian networks, called recursive conditioning. On one extreme, recursive conditioning takes O(n) space and O(n exp(w log n)) time, where n is the size of a Bayesian network and w is the width of a given elimination order, thereby establishing a new complexity result for linear-space inference in Bayesian networks. On the other extreme, recursive conditioning takes O(n exp(w)) space and O(n exp(w)) time, matching the complexity of state-of-the-art algorithms based on clustering and elimination. Between linear and exponential space, recursive conditioning can utilize memory in increments of X bytes, where X is the number of bytes needed to store a floating-point number in a cache. Moreover, the algorithm is equipped with a formula for computing its average running time under any amount of space, providing a valuable tool for time-space tradeoffs in demanding applications. Recursive conditioning is therefore the first algorithm for exact inference in Bayesian networks to offer a smooth tradeoff between time and space, and to explicate a smooth, quantitative relationship between these two important resources. (C) 2001 Elsevier Science B.V. All rights reserved.
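To illustrate the idea behind the abstract, here is a minimal, hypothetical sketch of recursive conditioning on a toy chain network A -> B -> C (the network, its CPT numbers, and all names are illustrative assumptions, not from the paper). Each internal node of a decomposition tree sums out its cutset variables, multiplies the results of its two subtrees, and memoizes results keyed on its context variables; the cache is what the any-space property would bound or disable.

```python
from itertools import product

# Hypothetical toy network A -> B -> C over binary variables.
# CPTs are indexed as cpts[var][parent_values][var_value].
cpts = {
    "A": {(): {0: 0.6, 1: 0.4}},                             # P(A)
    "B": {(0,): {0: 0.7, 1: 0.3}, (1,): {0: 0.2, 1: 0.8}},   # P(B|A)
    "C": {(0,): {0: 0.9, 1: 0.1}, (1,): {0: 0.5, 1: 0.5}},   # P(C|B)
}

class Leaf:
    """Dtree leaf: one CPT of the network."""
    def __init__(self, var, parents):
        self.var, self.parents = var, parents

class Node:
    """Internal dtree node: cutset to condition on, context for caching."""
    def __init__(self, cutset, context, left, right):
        self.cutset, self.context = cutset, context
        self.left, self.right = left, right

cache = {}  # any-space: this dict could be bounded or dropped entirely

def rc(t, inst):
    if isinstance(t, Leaf):
        # Look up the CPT entry under the current instantiation.
        return cpts[t.var][tuple(inst[p] for p in t.parents)][inst[t.var]]
    key = (id(t), tuple(inst[v] for v in t.context))
    if key in cache:
        return cache[key]
    total = 0.0
    # Condition on (i.e., sum out) the cutset, decomposing the network
    # into two independent subproblems solved recursively.
    for vals in product([0, 1], repeat=len(t.cutset)):
        inst.update(dict(zip(t.cutset, vals)))
        total += rc(t.left, inst) * rc(t.right, inst)
    cache[key] = total
    return total

# Dtree: root cuts on B, splitting {P(A), P(B|A)} from {P(C|B)};
# the left internal node then cuts on A.
dtree = Node(["B"], [],
             Node(["A"], ["B"], Leaf("A", []), Leaf("B", ["A"])),
             Leaf("C", ["B"]))

# P(C = 1), obtained by summing out A and B via recursive conditioning.
p_c1 = rc(dtree, {"C": 1})
```

With full caching the algorithm sits at the exponential-space end of the tradeoff; shrinking or disabling `cache` recomputes subtree values instead, trading time for space as the abstract describes.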

