The functional structure of our new network is not preset; instead, it comes into existence in a stochastic manner.
The anatomical structure of our model consists of two input “neurons”, several hundred to five thousand hidden-layer “neurons”, and one output “neuron”.
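A minimal sketch of this anatomy, assuming a dense feed-forward topology, tanh activations, and numpy; the hidden-layer width and the random seed are illustrative choices within the stated range, not values from the original:

```python
import numpy as np

rng = np.random.default_rng(0)

N_INPUT, N_HIDDEN, N_OUTPUT = 2, 1000, 1  # hidden width: hundreds up to 5000

# The weights are not preset; they come into existence stochastically.
w_hidden = rng.normal(0.0, 1.0, size=(N_HIDDEN, N_INPUT))
w_output = rng.normal(0.0, 1.0, size=(N_OUTPUT, N_HIDDEN))

def forward(x):
    """Propagate a 2-element input through the hidden layer to the single output."""
    h = np.tanh(w_hidden @ x)
    return np.tanh(w_output @ h)

print(forward(np.array([0.5, -0.3])))
```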
The process itself is iterative, i.e., based on a mathematical operation governed by a set of rules in which repetition progressively approximates the desired result.
Each iteration begins with data being introduced into the input layer and processed according to a particular algorithm in the hidden layer; it continues with the computation of certain, as yet very crude, configurations of images regulated by a genetic code, and ends with the selection of the 10% most accomplished “offspring”. The next iteration then applies these new, most successful variants of the results, i.e., descendants, to the continued process of image perfection. The ever-new variants (descendants) of the genetic algorithm are always generated randomly; the deterministic rule only requires choosing 10% of all available variants (in our case, the 20 optimal variants out of 200).
The stochastic model is marked by a number of characteristics: the initial conditions are determined by differing variance in data dispersion, and the evolution of the network organisation is controlled by genetic rules of a purely stochastic nature; noise with a Gaussian distribution proved to be the best “organiser”.
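The following sketch illustrates one such iteration under stated assumptions: the “genetic code” of each variant is taken to be a weight vector, the fitness function is a hypothetical stand-in for the image-quality score, and mutations are drawn from a Gaussian distribution, as the text suggests. The population size (200) and selection fraction (10%, i.e., 20 survivors) follow the figures above; the genome length and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

POP_SIZE, SURVIVORS = 200, 20          # select 10% of all available variants
GENOME_LEN, NOISE_SIGMA = 50, 0.1      # illustrative values, not from the original

def fitness(genome):
    """Hypothetical stand-in for the image-quality score used in the paper."""
    return -np.sum(genome ** 2)        # toy objective: prefer small weights

def iterate(parents):
    """One generation: random Gaussian offspring, deterministic top-10% selection."""
    # Each survivor spawns offspring perturbed by Gaussian noise (the "organiser").
    offspring = np.repeat(parents, POP_SIZE // SURVIVORS, axis=0)
    offspring += rng.normal(0.0, NOISE_SIGMA, size=offspring.shape)
    # The deterministic rule: keep the 20 best-scoring variants out of 200.
    scores = np.array([fitness(g) for g in offspring])
    return offspring[np.argsort(scores)[-SURVIVORS:]]

population = rng.normal(0.0, 1.0, size=(SURVIVORS, GENOME_LEN))
for _ in range(100):
    population = iterate(population)
```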
Another analogy between artificial networks and neuronal structures lies in the use of time in network algorithms.
For that reason, we gave our network’s organisation a kind of temporal development: rather than being instantaneous, the connection between the artificial elements, or neurons, consumes a certain number of time units per synapse or, more precisely, per contact between the preceding and the subsequent neuron.
The latency of neurons, natural and artificial alike, is very important as it enables feedback action.
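A minimal sketch of this temporal development, assuming a discrete-time event queue in which each synaptic contact costs one time unit; the delay value, the attenuation factor, and the toy topology are illustrative assumptions:

```python
from collections import deque

SYNAPSE_DELAY = 1  # time units consumed per contact between neurons (assumed value)

def propagate(connections, source, signal, steps):
    """Deliver a signal through the network, one synapse delay at a time.

    `connections` maps a neuron to the neurons it feeds; feedback loops
    (a neuron reaching an earlier one) are possible precisely because
    delivery is not instantaneous.
    """
    pending = deque([(source, signal, 0)])
    while pending:
        neuron, value, t = pending.popleft()
        if t >= steps:
            continue
        print(f"t={t}: neuron {neuron} fires with {value:.2f}")
        for target in connections.get(neuron, []):
            pending.append((target, value * 0.9, t + SYNAPSE_DELAY))

# Tiny network with a feedback connection from neuron 2 back to neuron 1.
propagate({0: [1], 1: [2], 2: [1]}, source=0, signal=1.0, steps=5)
```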
Our network becomes organised under the effect of considerable noise, the amount of which must then subside. If, however, the network evolution gets stuck in a local minimum, the amount of noise has to be increased again. While this makes the network organisation waver, it also increases the likelihood that the crisis in the local minimum will abate and that the state of the network’s self-organisation will improve substantially.
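This noise regulation can be read as an annealing schedule with reheating; a minimal sketch, in which the decay factor, reheat factor, and noise floor are assumed constants rather than values from the original:

```python
NOISE_DECAY, NOISE_REHEAT, NOISE_FLOOR = 0.99, 2.0, 1e-4  # assumed constants

def update_noise(noise_level, stuck_in_local_minimum):
    """Let the noise subside, but raise it again when evolution stalls."""
    if stuck_in_local_minimum:
        return noise_level * NOISE_REHEAT   # shake the network out of the minimum
    return max(noise_level * NOISE_DECAY, NOISE_FLOOR)
```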
Our system allows for constant reading of the state of the network by means of establishing the network’s energy level, i.e., basically ascertaining the progression of the network’s rate of success in self-organisation. This is the principal parameter for detecting any jam in a local minimum, and it serves as input information for the formator algorithm, which regulates the level of noise in the system.
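A sketch of such an energy reading, assuming that a decrease in energy marks progress in self-organisation and that a “jam” is declared when the energy has not improved over a recent window; the window length and tolerance are hypothetical parameters:

```python
WINDOW, TOLERANCE = 25, 1e-6  # hypothetical jam-detection parameters

def jammed_in_local_minimum(energy_history):
    """Detect a jam: no meaningful energy improvement over the last WINDOW readings."""
    if len(energy_history) <= WINDOW:
        return False
    best_before = min(energy_history[:-WINDOW])
    best_recent = min(energy_history[-WINDOW:])
    return best_recent > best_before - TOLERANCE  # no real progress lately
```

The flag returned here is precisely the `stuck_in_local_minimum` input of the noise-update rule sketched above, closing the formator’s regulatory loop.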