We propose a new additive decomposition of probability tables, which we call {\em tensor rank-one decomposition}. The basic idea is to decompose a {\em probability table} into a series of tables whose sum equals the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Table entries are allowed to be arbitrary real numbers, i.e., they may also be negative. The possibility of negative entries, in contrast to a multiplicative decomposition, opens new possibilities for a compact representation of probability tables. We show that {\em tensor rank-one decomposition} can be used to reduce the space and time requirements of probabilistic inference. We provide a closed-form solution of the minimal tensor rank-one decomposition for some special tables and propose a numerical algorithm that can be used when a closed-form solution is not known.
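As an illustration of the decomposition just described (the symbols $\mathcal{A}$, $\varphi_{b,i}$, and $r$ are notation chosen here, not taken from the text above), a table $\mathcal{A}$ over variables $X_1,\ldots,X_d$ is written as a sum of $r$ tables, each of which is a product of one-dimensional tables:
\[
\mathcal{A}(x_1,\ldots,x_d) \;=\; \sum_{b=1}^{r} \prod_{i=1}^{d} \varphi_{b,i}(x_i),
\]
where the one-dimensional tables $\varphi_{b,i}$ may contain negative entries, and the smallest $r$ for which such a decomposition exists is the rank of $\mathcal{A}$.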
Bayesian networks are a popular model for reasoning under uncertainty. We study the problem of efficient probabilistic inference in these models when some of the conditional probability tables represent deterministic or noisy ℓ-out-of-k functions. Such tables appear naturally in real-world applications when we observe the state of a variable that depends on its parents through an addition or noisy addition relation. We provide a lower bound on the rank and an upper bound on the symmetric border rank of tensors representing ℓ-out-of-k functions. We propose an approximation of tensors representing noisy ℓ-out-of-k functions by a sum of r tensors of rank one, where r is an upper bound on the symmetric border rank of the approximated tensor. We apply the suggested approximation to probabilistic inference in probabilistic graphical models. Numerical experiments reveal that the approximation can yield a gain of about two orders of magnitude, at the expense of a certain loss of precision.
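As a sketch of the setting (with notation chosen here rather than taken from the text above, and reading the deterministic ℓ-out-of-k function as the child $Y$ being in state $1$ exactly when $\ell$ of its $k$ binary parents $X_1,\ldots,X_k$ are in state $1$), the corresponding conditional probability table is the tensor
\[
\mathcal{T}(y,x_1,\ldots,x_k)=
\begin{cases}
1 & \text{if } y=1 \text{ and } \sum_{i=1}^{k} x_i=\ell,\\
1 & \text{if } y=0 \text{ and } \sum_{i=1}^{k} x_i\neq\ell,\\
0 & \text{otherwise,}
\end{cases}
\]
and the proposed approximation replaces $\mathcal{T}$ (or its noisy counterpart) by a sum of $r$ rank-one terms,
\[
\mathcal{T} \;\approx\; \sum_{b=1}^{r} \mathbf{a}_{b}\otimes\mathbf{b}_{b,1}\otimes\cdots\otimes\mathbf{b}_{b,k},
\]
so that the table is stored using $2(k+1)r$ numbers instead of $2^{k+1}$ entries.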