Kybernetika 47 no. 3, 317-336, 2011

Rank of tensors of l-out-of-k functions: an application in probabilistic inference

Jiří Vomlel

Abstract:

Bayesian networks are a popular model for reasoning under uncertainty. We study the problem of efficient probabilistic inference with these models when some of the conditional probability tables represent deterministic or noisy $\ell$-out-of-$k$ functions. These tables appear naturally in real-world applications when we observe a state of a variable that depends on its parents via an addition or noisy-addition relation. We provide a lower bound on the rank and an upper bound on the symmetric border rank of tensors representing $\ell$-out-of-$k$ functions. We propose an approximation of tensors representing noisy $\ell$-out-of-$k$ functions by a sum of $r$ rank-one tensors, where $r$ is an upper bound on the symmetric border rank of the approximated tensor. We apply the suggested approximation to probabilistic inference in probabilistic graphical models. Numerical experiments reveal that the approximation can yield a speedup of about two orders of magnitude, at the expense of a certain loss of precision.
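To illustrate the kind of decomposition the abstract refers to, the sketch below builds the tensor of the exact $\ell$-out-of-$k$ function (the indicator that exactly $\ell$ of $k$ binary parents are in state 1) and reconstructs it as a sum of $k+1$ symmetric rank-one tensors via discrete Fourier interpolation. This is a minimal NumPy illustration of a rank-one-sum representation over the complex numbers, not the paper's algorithm; the choice $k=4$, $\ell=2$ is arbitrary.

```python
import numpy as np
from functools import reduce

k, ell = 4, 2  # illustrative values, not taken from the paper

# Tensor of the exact ell-out-of-k function: T[x1,...,xk] = 1 iff sum(x) == ell
T = np.zeros((2,) * k)
for idx in np.ndindex(*T.shape):
    if sum(idx) == ell:
        T[idx] = 1.0

# Sum of k+1 symmetric rank-one terms via discrete Fourier interpolation:
#   T[x] = (1/(k+1)) * sum_j w^(-j*ell) * prod_i w^(j*x_i),  w = exp(2*pi*i/(k+1)).
# Each term is an outer product of k copies of the vector [1, w^j],
# so the decomposition is symmetric with k+1 rank-one summands.
w = np.exp(2j * np.pi / (k + 1))
approx = np.zeros((2,) * k, dtype=complex)
for j in range(k + 1):
    v = np.array([1.0, w ** j])                 # mode vector [w^(j*0), w^(j*1)]
    rank_one = reduce(np.multiply.outer, [v] * k)
    approx += w ** (-j * ell) / (k + 1) * rank_one

assert np.allclose(approx.real, T)
assert np.allclose(approx.imag, 0.0)
```

The check passes because $\frac{1}{k+1}\sum_{j=0}^{k}\omega^{j(s-\ell)}$ equals 1 exactly when $s \equiv \ell \pmod{k+1}$, and the sum $s$ of $k$ binary inputs never exceeds $k$, so only $s=\ell$ survives. This gives $k+1$ rank-one terms, matching the flavor of the symmetric-rank upper bounds the paper studies.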

Keywords:

Bayesian networks, probabilistic inference, tensor rank, uncertainty in artificial intelligence, symmetric rank, border rank

Classification:

68T37, 62E15, 15A69