The AI matrix
The size and shape of the main matrix is determined at setup. It consists of an inner matrix surrounded by an input and output region, and each node consists of another matrix (the element matrix).
The illustration below represents a 5x4 main matrix (5 rows, 4 columns) in which each node is a 3x2 element matrix (3 rows, 2 columns), hence expressed as a 5x4x3x2 matrix. Element matrix nodes are integers; their values range from 0 to the High end of the element number range chosen during setup.
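For concreteness, that layout can be sketched as nested lists. This is a minimal sketch only: the names, the nested-list representation and the High end value of 15 are assumptions, since the actual implementation is not given here.

```python
import random

MAIN_ROWS, MAIN_COLS = 5, 4   # main matrix: 5 rows, 4 columns
ELEM_ROWS, ELEM_COLS = 3, 2   # element matrix per node: 3 rows, 2 columns
HIGH_END = 15                 # assumed High end of the element number range

def make_element_matrix():
    """One node of the main matrix: a 3x2 grid of integers in 0..HIGH_END."""
    return [[random.randint(0, HIGH_END) for _ in range(ELEM_COLS)]
            for _ in range(ELEM_ROWS)]

# The 5x4x3x2 structure: a main matrix whose nodes are element matrices.
main_matrix = [[make_element_matrix() for _ in range(MAIN_COLS)]
               for _ in range(MAIN_ROWS)]

print(len(main_matrix), len(main_matrix[0]))              # 5 4
print(len(main_matrix[0][0]), len(main_matrix[0][0][0]))  # 3 2
```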
Input, output and inner matrix nodes are connected according to the setup specifications expressed as a percentage.
Here a Connectivity of 30% is used, which in this configuration means each node is connected to 2 other nodes (i.e., 30% of the candidate nodes, rounded).
Connection depth refers to the number of levels the traversal across the connections takes.
In this case the Connection depth is set to 1, which means the traversal goes from each node to its two child nodes.
If the depth were set to 2, each one of the child nodes would have further access to two more nodes; and so on.
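Connectivity and Connection depth together can be sketched as follows. This is only an illustration under assumptions: the node count is arbitrary, and the percentage is taken to select a rounded share of the other candidate nodes, since the exact mapping is not spelled out here.

```python
import random

random.seed(0)         # reproducible illustration

NUM_NODES = 6          # assumed number of inner matrix nodes
CONNECTIVITY = 0.30    # 30% Connectivity, as in the example above

def build_connections(num_nodes, connectivity):
    """Give each node a fixed number of child connections to other nodes."""
    per_node = max(1, round(connectivity * (num_nodes - 1)))
    conns = {}
    for node in range(num_nodes):
        others = [n for n in range(num_nodes) if n != node]  # no self-connection
        conns[node] = random.sample(others, per_node)
    return conns

def traverse(conns, start, depth):
    """Collect the nodes reachable from `start` within `depth` levels."""
    frontier, seen = {start}, set()
    for _ in range(depth):
        nxt = set()
        for node in frontier:
            for child in conns[node]:
                if child not in seen and child != start:
                    nxt.add(child)
        seen |= nxt
        frontier = nxt
    return seen

conns = build_connections(NUM_NODES, CONNECTIVITY)
print(traverse(conns, 0, 1))  # depth 1: only the direct child nodes
print(traverse(conns, 0, 2))  # depth 2: the children's children as well
```

With 6 nodes, 30% of the 5 candidates rounds to 2 connections per node, matching the example above.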
The processes between input nodes and inner matrix nodes, between inner matrix nodes, and between inner matrix nodes and output nodes are essentially the same. The only difference lies in their respective connectivity: input and output nodes are not connected to each other; only inner matrix nodes are interconnected.
No inner matrix node is connected directly to itself; however, depending on the level of connectivity, several inner matrix nodes can be part of the same connection tree.
There is only one processing algorithm (see Further references). Its general purpose is to move one integer (the resident value - Res) towards another (the reference value - Ref). Res represents one particular element node of a main matrix node; Ref is its equivalent (i.e., the element node in the same position of its element matrix) in another, connected main matrix node. The schema is shown below.
One - isolated - example of what happens to Res is shown below.
In this case the starting values were 3 for Ref and 14 for Res, with the result of every iteration fed back into the formula, thereby producing another value for Res. The graph illustrates those values.
Notice a number of irregular swings before the results settle into an oscillation which repeats itself from then on.
If Ref is seen as an attractor, the outcome is similar to the three types of chaos-based phenomena: Ref can be a stable attractor (Res eventually converges on Ref), a periodic one (as shown here), or a strange attractor (Res adopting a seemingly random set of values).
When the integer values 'settle down' depends on the numbers themselves; even an increment or decrement of 1 can change the result from one type to another.
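The actual processing algorithm is only referenced above, so the update rule below is a hypothetical stand-in. What it does illustrate faithfully is that any deterministic map over a finite integer range must eventually settle into a fixed point or a cycle - mirroring the stable and periodic cases - and that a change of 1 in one of the values can switch the type.

```python
def step(res, ref, modulus=16):
    """Hypothetical stand-in update rule moving Res in relation to Ref.
    (The actual algorithm is in the project's Further references.)"""
    return (3 * res + ref) % modulus

def classify(ref, res, modulus=16, max_iters=200):
    """Iterate Res, feeding each result back in, until a repeat appears.
    A repeat of length 1 behaves like a stable attractor; longer repeats
    are periodic oscillations."""
    seen = {}
    for i in range(max_iters):
        if res in seen:
            period = i - seen[res]
            return "stable" if period == 1 else f"periodic (length {period})"
        seen[res] = i
        res = step(res, ref, modulus)
    return "no repeat found"

print(classify(ref=3, res=14))  # -> periodic (length 8)
print(classify(ref=4, res=14))  # -> stable
```

With Ref = 3 and Res = 14 this stand-in rule settles into an oscillation of length 8, while changing Ref by 1 (to 4) makes Res = 14 a fixed point - the kind of sensitivity described above.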
Suppose we take a snapshot somewhere at the beginning of the iterations, before a pattern has become discernible. At that point the outcome is impossible to tell. Nevertheless, regardless of that earlier phase's appearance, the movement of the numbers does already represent the later result, albeit in its latent stage. Since (a) the type of the eventual pattern is a function of the value of the two integers, (b) even a difference of 1 can change the nature of the pattern, and (c) the same types of patterns can be found at various number combinations across the element number range chosen during setup, the creation of any particular type of pattern (and of its smaller variations, its 'sub-types') can occur at any time, between any two connected nodes, anywhere within the number range.
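Point (c) can be illustrated with the same kind of hypothetical stand-in rule (again, the actual formula is only referenced, not given here): scanning every (Ref, Res) combination in a small range shows the same period lengths - the same pattern types - recurring at many different number combinations.

```python
from collections import Counter

def period_of(ref, res, modulus=16):
    """Length of the cycle the stand-in rule Res -> (3*Res + Ref) mod m
    eventually settles into, ignoring the transient phase."""
    seen = {}
    i = 0
    while res not in seen:
        seen[res] = i
        res = (3 * res + ref) % modulus
        i += 1
    return i - seen[res]

# Tally the pattern types (period lengths) over every number combination.
counts = Counter(period_of(ref, res)
                 for ref in range(16) for res in range(16))
print(dict(counts))  # how often each period length occurs across the range
```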
Therefore, during any cycle of traversal across the inner matrix nodes, those patterns will be realised.
As the above illustration of affinity relationships shows, similarities of varying degrees emerge. Their stability depends on what happens among the other nodes - keep in mind that the connectivity, coupled with the level of depth traversal, ensures that any node's current state acts within a framework of interdependencies.
Following the progress on the screen, we observe some nodes maintaining their relationships (i.e., they keep the same colour), some regions changing, and certain nodes within a region suddenly becoming more unique. Significantly, nodes tend to change their states together (say their colour changes from some blue to some green, meaning they become more dissimilar in relation to the other nodes), further evidence of their interdependency.
In the end it all comes down to the input, in this case from the webcam.
The latency mentioned earlier (i.e., patterns not yet visible already influencing outcomes) also plays a role during a change in input.
For example, some input A produces certain regions of patterns; a subsequent input B then modifies those patterns (or at least some of them). It is possible that at that stage some further input C triggers the latent states produced earlier on by A. In principle the phenomenon is the same as during the emergence of affinity domains in the first place, only displaced in time.
All this leads to further considerations concerning memory and its recall (see Further references).
The nodes enter representative states in relation to the input and, through their affinity relationships, produce domains of varying stability. Subsequent input does not erase the latency produced by earlier input, to the extent that earlier states are re-evoked if some later input is capable of triggering them. Although latency exists, it does not make for an exact replica of previous states.
NOTE: the displays of output node states and of the affinity relationships between the inner matrix nodes are just that - displays. The interpretation through the colour scheme does not change the states themselves. For an animation showing the changing nature of the nodes' states, see Animation of AI engine states.