We propose a new additive decomposition of probability tables: {\em tensor rank-one decomposition}. The basic idea is to decompose a {\em probability table} into a series of tables whose sum equals the original table. Each table in the series has the same domain as the original table but can be expressed as a product of one-dimensional tables. Table entries are allowed to be arbitrary real numbers, i.e., they may also be negative. The possibility of negative entries, in contrast to a multiplicative decomposition, opens new possibilities for a compact representation of probability tables. We show that {\em tensor rank-one decomposition} can be used to reduce the space and time requirements of probabilistic inference. We provide a closed-form solution for the minimal tensor rank-one decomposition of some special tables and propose a numerical algorithm for cases in which a closed-form solution is not known.
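As a minimal illustration of the idea, the table of the deterministic OR function of two binary parents can be written as a sum of two rank-one terms, one of which contains a negative entry. The factor vectors below are an illustrative decomposition chosen for this sketch, not taken from the paper's closed-form results:

```python
import numpy as np

# Deterministic OR table: T[x, y, z] = 1 iff z = x OR y (all variables binary).
T = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        T[x, y, x | y] = 1.0

# A decomposition into two rank-one terms. Note the entry -1 in c2:
# a multiplicative (non-negative) factorization could not use it.
a1, b1, c1 = np.array([1., 1.]), np.array([1., 1.]), np.array([0., 1.])
a2, b2, c2 = np.array([1., 0.]), np.array([1., 0.]), np.array([1., -1.])

T_hat = (np.einsum('i,j,k->ijk', a1, b1, c1)
         + np.einsum('i,j,k->ijk', a2, b2, c2))

assert np.allclose(T, T_hat)
```

Two rank-one terms, each stored as three vectors of length two, reproduce all eight entries of the original table; this is the kind of saving the decomposition exploits during inference.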
In this paper, we generalize the noisy-or model. The generalizations are three-fold. First, we allow parents to be multivalued ordinal variables. Second, parents can have both positive and negative influences on their common child. Third, we describe how the suggested generalization can be extended to multivalued child variables. The major advantage of our generalizations is that they require only one parameter per parent. We suggest a model-learning method and report results of experiments on the Reuters text classification data. The generalized noisy-or models achieve equal or better performance than the standard noisy-or model. An important property of the noisy-or model and of the generalizations suggested in this paper is that they allow more efficient exact inference than logistic regression models do.
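For reference, the standard noisy-or model being generalized computes the child's activation probability from one parameter per parent. Treating an ordinal parent level x_i as an exponent is one way a single parameter can cover multivalued parents; this reading is an assumption of the sketch, not necessarily the paper's parameterization:

```python
import numpy as np

def noisy_or(p, x):
    """P(child = 1 | x) = 1 - prod_i (1 - p_i)**x_i.

    p[i] is the probability that parent i alone activates the child;
    x[i] is the parent's level: 0/1 in the standard model, any
    non-negative integer in this sketch's ordinal reading.
    """
    p, x = np.asarray(p, float), np.asarray(x, float)
    return 1.0 - np.prod((1.0 - p) ** x)

noisy_or([0.9, 0.8], [1, 1])  # ~0.98: either active parent likely suffices
noisy_or([0.5], [2])          # ~0.75: a level-2 parent inhibits twice
```

With all parents at level 0 the product of inhibitions is 1, so the child is off with certainty, matching the standard model's leak-free behavior.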
Bayesian networks have become a popular framework for reasoning under uncertainty. Efficient methods have been developed for probabilistic reasoning with new evidence. However, when new evidence is uncertain or imprecise, several different methods have been proposed. The original contributions of this paper are guidelines for the treatment of different types of uncertain evidence, rules for combining evidence from different sources, and a method for model revision with uncertain evidence.
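The distinction such guidelines must address can be seen on a single binary variable: a soft finding may fix the variable's posterior directly (Jeffrey's rule) or enter as a likelihood ratio combined with the prior by Bayes' rule (Pearl's virtual evidence). The numbers below are hypothetical and chosen so the two readings coincide:

```python
import numpy as np

prior = np.array([0.7, 0.3])     # P(A=0), P(A=1)

# Jeffrey's rule: the soft finding prescribes the posterior of A directly.
jeffrey = np.array([0.4, 0.6])   # desired P'(A)

# Virtual evidence: the finding is a likelihood ratio L(A) (hypothetical
# values here), multiplied into the prior and renormalized.
L = np.array([1.0, 3.5])
virtual = prior * L
virtual /= virtual.sum()

assert np.allclose(virtual, jeffrey)
```

The same soft finding thus yields the same posterior only if the likelihood ratio is calibrated against the prior; with a different prior, the two rules disagree, which is exactly why the treatment of each evidence type must be specified.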
Bayesian networks are a popular model for reasoning under uncertainty. We study the problem of efficient probabilistic inference with these models when some of the conditional probability tables represent deterministic or noisy ℓ-out-of-k functions. These tables appear naturally in real-world applications when we observe a state of a variable that depends on its parents via an addition or noisy-addition relation. We provide a lower bound on the rank and an upper bound on the symmetric border rank of tensors representing ℓ-out-of-k functions. We propose an approximation of tensors representing noisy ℓ-out-of-k functions by a sum of r rank-one tensors, where r is an upper bound on the symmetric border rank of the approximated tensor. We apply the suggested approximation to probabilistic inference in probabilistic graphical models. Numerical experiments show a speedup of roughly two orders of magnitude, at the expense of some loss of precision.
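The tables in question are easy to construct directly. The sketch below builds the table of a deterministic ℓ-out-of-k function under the reading "child is on iff at least ℓ of the k binary parents are on" (an exactly-ℓ variant is analogous; the reading is an assumption of this sketch) and checks that it is symmetric in its parents, the property underlying the symmetric (border) rank bounds:

```python
import numpy as np
from itertools import product

def l_out_of_k_tensor(l, k):
    """Table T of shape (2,)*k + (2,): T[x1,...,xk,z] = 1 iff z = [sum(x) >= l]."""
    T = np.zeros((2,) * k + (2,))
    for x in product((0, 1), repeat=k):
        z = int(sum(x) >= l)
        T[x + (z,)] = 1.0
    return T

T = l_out_of_k_tensor(2, 3)          # 2-out-of-3
assert np.allclose(T, T.transpose(1, 0, 2, 3))   # symmetric in the parents
assert np.allclose(T.sum(axis=-1), 1.0)          # a valid CPT: rows sum to 1
```

Because the table depends on its parents only through their sum, all parent axes are exchangeable, which is what makes symmetric rank-one decompositions the natural tool.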