This paper presents a stopping rule for random search in Bayesian model-structure estimation via maximisation of the likelihood function. The maximisation under study uses random restarts to cope with local maxima in a discrete search space. The stopping rule, suitable for any maximisation of this type, exploits the probability of having found the global maximum, as implied by the number of local maxima already found. It stops the search when this probability crosses a given threshold. The inspected case is an important instance of search in a huge hypothesis space, common in artificial intelligence, machine learning and computer science.
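The scheme the abstract describes can be sketched as follows. The abstract does not specify how the probability of having found the global maximum is computed, so this sketch substitutes a simple Good-Turing-style heuristic: the chance that a fresh restart reaches an unseen local maximum is estimated by the fraction of local maxima seen exactly once. The toy objective, the `hill_climb` helper, and the threshold value are all illustrative assumptions, not the paper's construction.

```python
import random
from collections import Counter

def hill_climb(x, f, neighbours):
    """Greedy ascent to a local maximum of f in a discrete space."""
    while True:
        best = max(neighbours(x), key=f, default=x)
        if f(best) <= f(x):
            return x
        x = best

def random_restart_search(f, neighbours, sample, threshold=0.95,
                          max_restarts=10_000, rng=random):
    """Random-restart maximisation with a probabilistic stopping rule.

    Stops once the estimated probability that the global maximum has
    already been found crosses `threshold`.  Good-Turing stand-in for
    the paper's rule: with f1 = number of distinct local maxima seen
    exactly once in r restarts, the unseen-basin mass is ~ f1 / r.
    """
    seen = Counter()   # distinct local maxima and how often each was hit
    best = None
    for r in range(1, max_restarts + 1):
        opt = hill_climb(sample(rng), f, neighbours)
        seen[opt] += 1
        if best is None or f(opt) > f(best):
            best = opt
        f1 = sum(1 for c in seen.values() if c == 1)
        if r >= 10 and 1.0 - f1 / r >= threshold:
            break
    return best, r

# Toy multimodal objective on {0, ..., 99}: a quadratic peak at 70
# plus a bonus on multiples of 7, giving many local maxima.
f = lambda x: -(x - 70) ** 2 + 300 * (x % 7 == 0)
neighbours = lambda x: [y for y in (x - 1, x + 1) if 0 <= y <= 99]
sample = lambda rng: rng.randrange(100)

best, restarts = random_restart_search(f, neighbours, sample,
                                       rng=random.Random(0))
```

The search typically terminates after a few dozen restarts rather than exhausting the budget, once newly found optima stop appearing.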
Binary Factor Analysis (BFA) aims to discover latent binary structure in high-dimensional data. Parameter learning in BFA faces exponential computational complexity and a large number of local optima, which makes model selection, i.e. determining the latent binary dimension, difficult. Traditionally, it is implemented in two separate stages with two different objectives: first, parameter learning is performed for each candidate model scale to maximise the likelihood; then the optimal scale is selected to minimise a model selection criterion. Such a two-phase implementation suffers from heavy computational cost and degraded learning performance on large-scale structures. In contrast, Bayesian Ying-Yang (BYY) harmony learning starts from a high-dimensional model and automatically determines the latent dimension during learning. This paper investigates model selection for a subclass of BFA called Orthogonal Binary Factor Analysis (OBFA). The Bayesian inference of the latent binary code is solved analytically, and a BYY machine is constructed on this basis. The harmony measure that serves as the objective function in BYY learning is estimated more accurately by recovering a regularisation term. Experimental comparison with two-phase implementations shows the superior performance of the proposed approach.
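The two-phase scheme the abstract criticises can be illustrated generically. BFA itself is not sketched here; as a stand-in, this example uses probabilistic PCA (a Gaussian factor model with a closed-form maximum likelihood due to Tipping and Bishop) so that phase 1 (fit every candidate latent dimension by ML) and phase 2 (pick the dimension minimising BIC) are both explicit. The data, dimensions, and use of BIC as the criterion are illustrative assumptions.

```python
import numpy as np

def two_phase_select(X, k_max):
    """Two-phase model selection for probabilistic PCA.

    Phase 1: for each candidate latent dimension k, compute the ML fit
    (closed form via the eigenvalues of the sample covariance).
    Phase 2: return the k that minimises BIC.
    """
    n, d = X.shape
    Xc = X - X.mean(axis=0)
    eig = np.linalg.eigvalsh(Xc.T @ Xc / n)[::-1]   # descending eigenvalues
    bics = {}
    for k in range(1, k_max + 1):
        sigma2 = eig[k:].mean()                     # ML noise variance
        loglik = -0.5 * n * (d * np.log(2 * np.pi)
                             + np.log(eig[:k]).sum()
                             + (d - k) * np.log(sigma2) + d)
        # Free parameters: loadings (with rotation removed), noise, mean.
        n_params = d * k - k * (k - 1) / 2 + 1 + d
        bics[k] = -2 * loglik + n_params * np.log(n)
    return min(bics, key=bics.get), bics

# Synthetic data with a 3-dimensional latent subspace plus small noise.
rng = np.random.default_rng(0)
Z = rng.standard_normal((500, 3))
W = rng.standard_normal((3, 10))
X = Z @ W + 0.1 * rng.standard_normal((500, 10))

best_k, bics = two_phase_select(X, k_max=6)
```

Note the cost structure the abstract objects to: every candidate scale requires a full fit before the criterion is ever consulted, whereas BYY harmony learning folds the selection into a single learning run.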