Probabilistic mixtures provide flexible ``universal'' approximation of probability density functions. Their wide use is enabled by the availability of a range of efficient estimation algorithms. Among them, quasi-Bayesian estimation plays a prominent role, as it runs ``naturally'' in one-pass mode. This is important in on-line applications and with extensive databases. It even copes with the dynamic nature of the components forming the mixture. However, quasi-Bayesian estimation relies on mixing via constant component weights, so mixtures with dynamic components and dynamic transitions between them are not supported. The present paper fills this gap. For simplicity, and to give better insight into the task, the paper considers mixtures with known components. The general case with unknown components will be presented in a follow-up paper.
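For context, the following is a minimal sketch of the classical one-pass quasi-Bayes update for static weights, i.e. the baseline that the paper generalises. It assumes known scalar component densities; the function name `quasi_bayes_weights` and the Dirichlet initialisation are illustrative, not taken from the paper.

```python
import numpy as np

def quasi_bayes_weights(data, components, kappa0=None):
    """One-pass quasi-Bayes estimation of static mixture weights.

    data       : iterable of observations
    components : known component densities, each f_c(x) -> float
    kappa0     : initial Dirichlet counts (prior); defaults to ones
    """
    C = len(components)
    kappa = np.ones(C) if kappa0 is None else np.asarray(kappa0, dtype=float)
    for x in data:
        lik = np.array([f(x) for f in components])  # component likelihoods at x
        w = kappa * lik                             # responsibilities under the
        w /= w.sum()                                # current weight estimates
        # Quasi-Bayes step: add fractional counts instead of propagating
        # the exact (exponentially growing) posterior mixture.
        kappa += w
    return kappa / kappa.sum()  # point estimate of the component weights
```

Note that the constancy of the weights enters only through the accumulated counts `kappa`; it is exactly this assumption that the paper relaxes.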
The paper presents a stopping rule for the random search used in Bayesian model-structure estimation by maximising the likelihood function. The inspected maximisation uses random restarts to cope with local maxima in a discrete space. The stopping rule, suitable for any maximisation of this type, exploits the probability of having found the global maximum, as implied by the number of local maxima already found. It stops the search when this probability crosses a given threshold. The inspected case represents an important example of search in the huge hypothesis spaces so common in artificial intelligence, machine learning, and computer science.
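As an illustration only, the sketch below implements a restart loop with a stopping rule of this general type. The paper's exact probability computation is not reproduced here; instead, the sketch substitutes the Good-Turing missing-mass estimate (the fraction of local maxima seen exactly once) as a standard surrogate for the chance that a further restart would land in an as-yet-undiscovered basin. All function names are hypothetical.

```python
import random
from collections import Counter

def hill_climb(f, x, neighbours):
    """Greedy ascent in a discrete space; returns a local maximum of f."""
    while True:
        best = max(neighbours(x), key=f, default=x)
        if f(best) <= f(x):
            return x
        x = best

def restart_until_confident(f, sample_start, neighbours,
                            threshold=0.95, max_restarts=10_000):
    """Random restarts with a probabilistic stopping rule.

    Stops when the surrogate probability that a fresh restart would stay
    inside an already-seen basin crosses `threshold`. Local maxima must
    be hashable (e.g. tuples of booleans) so they can be counted.
    """
    seen = Counter()
    best = None
    for n in range(1, max_restarts + 1):
        m = hill_climb(f, sample_start(), neighbours)
        seen[m] += 1
        if best is None or f(m) > f(best):
            best = m
        # Good-Turing estimate of the probability mass of unseen basins.
        singletons = sum(1 for c in seen.values() if c == 1)
        p_unseen = singletons / n
        if 1.0 - p_unseen >= threshold:
            return best, n
    return best, max_restarts
```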
The paper solves the problem of minimizing the Kullback divergence between a partially known and a completely known probability distribution. It considers two probability distributions of a random vector $(u_1,x_1,\ldots,u_T,x_T)$ on a $2T$-dimensional sample space. One of the distributions is known; the other is known only partially. Namely, only the conditional probability distributions of $x_\tau$ given $u_1,x_1,\ldots,u_{\tau-1},x_{\tau-1},u_\tau$ are known, for $\tau=1,\ldots,T$. Our objective is to determine the remaining conditional probability distributions of $u_\tau$ given $u_1,x_1,\ldots,u_{\tau-1},x_{\tau-1}$ such that the Kullback divergence of the partially known distribution with respect to the completely known distribution is minimal. An explicit solution of this problem was found previously for Markovian systems in Karný \cite{Karny:96a}. The general solution is given in this paper.
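To make the structure of such solutions concrete, here is a sketch of the backward (dynamic-programming) recursion for the Markovian special case treated in \cite{Karny:96a}, assuming finite discrete $u_\tau$ and $x_\tau$, time-invariant kernels, and strictly positive probabilities. The array names are illustrative; the paper's general, full-history solution is not reproduced here.

```python
import numpy as np

def fpd_policy(model, target_model, target_policy, T):
    """Backward recursion for the optimal input distributions,
    Markovian case with finite x and u.

    model[x, u, y]         known      f(x_t = y | u_t = u, x_{t-1} = x)
    target_model[x, u, y]  target     If(x_t = y | u_t = u, x_{t-1} = x)
    target_policy[x, u]    target     If(u_t = u | x_{t-1} = x)
    Returns a list of T arrays policy[t][x, u] minimizing the Kullback
    divergence of the partially known distribution w.r.t. the target.
    All probabilities are assumed strictly positive.
    """
    S, U, _ = model.shape
    # Stage-wise log-ratio of the known kernel to the target kernel.
    log_ratio = np.log(model) - np.log(target_model)
    V = np.zeros(S)                         # cost-to-go V_{T+1}(x_T) = 0
    policies = [None] * T
    for t in reversed(range(T)):
        # omega[x, u] = sum_y model[x,u,y] * (log_ratio[x,u,y] + V[y])
        omega = np.einsum('xuy,xuy->xu', model, log_ratio + V[None, None, :])
        unnorm = target_policy * np.exp(-omega)
        gamma = unnorm.sum(axis=1)          # normaliser gamma(x)
        policies[t] = unnorm / gamma[:, None]
        V = -np.log(gamma)                  # V_t(x) = -ln gamma(x)
    return policies
```

The recursion runs once backwards over $\tau = T,\ldots,1$, so the sketch costs $O(T\,S^2 U)$ operations for $S$ states and $U$ inputs.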