The mechanics of chaos: a primer for the human mind
The word 'chaos' is usually taken to mean a state of disorder. In the current context, however, it denotes a complex, dynamic system whose elements, subsystems in their own right, display a degree of variance while at the same time adhering to an overall pattern.
In a more mathematical sense a useful introduction is provided in the article "Chaos theory" on Wikipedia [1]. For the purposes of explaining the mind this text refers to the interactions between the elements of a complex, dynamic system in the real world, focusing in particular on one that features neurons as its functional components.
The mechanics of chaos
First, a few words about the meaning of the word 'functionality'. It is meant to describe the observed characteristic an object presents. For instance, if a rock feels hot to the touch, then we can append the functionality 'hot' to it while disregarding the precise internal states needed to absorb and then radiate a given amount of energy. The internal states, with their molecular structures, rates of energy absorption and dispersal and so on, are a matter of the rock's content; for the purpose of the exercise its functionality can be labelled 'hot' provided we remain at that level of abstraction.
Using 'functionality' in this manner allows us to identify relationships between two or more objects without having to rely on a detailed description of content. When it comes to describing the system of mind, not having to attach labels to its fundamental actions is vital (more on this later).
Let us start with a small example.
Suppose we have a system A that consists of ten functional units, in other words, units that do something, with each one contributing to the overall state of the system.
We also have another system B, again consisting of ten units, with A connected to B such that communication exists between the two, that is, data are transmitted from A to B affecting the latter's units, or elements. Hence we can say there exists a functional relationship between A and B.
Let us assume further that data from A affect the elements in B such that each one of them enters another state. For this to happen there has to be an affinity between A and B, or more precisely, there has to be an affinity between the elements of A and their equivalents in B. Since we assume all ten are affected, their affinity can be pegged at 100%.
We now add another system C, with a connection between A and C, and between B and C. If data from A to C affect only five elements in C, then we can set the mutual affinity between A and C at 50%. Let us also assume the affinity between B and C is once again 100%. Then, with data flowing from A to B, from A to C, and from B to C, the resultant affinity between A and B will be higher than that between A and C. In other words, B will be more affected by A than C is, and C will be more affected by B than by A. However, the particular states of these subsystems are subject to change at any given time, and data can flow in either direction. Despite the initially smaller number of elements in C being modified, the mutual relationship between B and C therefore opens three possibilities: fewer elements in B may remain affinitive with A; fewer elements in C may remain affinitive with B; and more elements in C may become affinitive with A because their states have been changed through C's relationship with B.
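The interplay just described can be pictured in a few lines of code. The following Python sketch is purely illustrative: the element states, the masks standing in for affinities and the rule that a receiving element simply adopts the sender's state are assumptions made for the example, not part of the model itself.

    # A toy version of the A-B-C example. Element states, masks and the
    # 'adopt the sender's state' rule are assumptions for illustration only.
    import random

    N = 10  # elements per system

    def transmit(source, target, mask):
        # Data flow: every target element covered by the mask adopts the
        # state of its counterpart in the source system.
        return [s if m else t for s, t, m in zip(source, target, mask)]

    def affinity(x, y):
        # Fraction of element pairs currently sharing the same state.
        return sum(a == b for a, b in zip(x, y)) / len(x)

    random.seed(1)
    A = [random.randint(0, 9) for _ in range(N)]
    B = [random.randint(0, 9) for _ in range(N)]
    C = [random.randint(0, 9) for _ in range(N)]

    mask_AB = [True] * N                # 100% affinity between A and B
    mask_AC = [True] * 5 + [False] * 5  # 50% affinity between A and C
    mask_BC = [True] * N                # 100% affinity between B and C

    B = transmit(A, B, mask_AB)         # data flow A -> B
    C = transmit(A, C, mask_AC)         # data flow A -> C
    C = transmit(B, C, mask_BC)         # data flow B -> C

    print("affinity A-B:", affinity(A, B))
    print("affinity A-C:", affinity(A, C))
    print("affinity B-C:", affinity(B, C))

In this particular run C ends up fully affinitive with A even though their direct connection only covers half of C's elements, because B carried A's states onward; change the masks or the direction of flow and the affinities shift accordingly, which is the third possibility mentioned above.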
Note that the relationships can be expressed without going into the actual physical characteristics of any of the elements making up A, B and C; we merely observe their cause-and-effect relationships in the functional sense.
If we scale up such a scenario to a level where we have thousands, indeed many millions of objects, each comprising not ten but many thousands of elements, then we have a classic complex, dynamic system.
We still do not know exactly what the particular characteristics of all those elements are, but considering the possibilities identified above we are able, in principle, to describe the likely outcomes in terms of probabilities. Since every one of the objects is under the combined influence of its neighbours, within the bounds determined by the interactions of the various affinity percentages, we can place their respective outcomes within probability envelopes.
The probability envelopes ultimately determine the range within which the state of each subsystem can change. They are defined by the aggregate mutualities existing within the system. Hence we have variance, but variance kept in check by the probability envelopes, which set the boundaries within which each of the subsystems is allowed to be modified in terms of its affinities.
This is chaos, a quite stable system that nevertheless demonstrates variance as it goes through its paces.
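A toy numerical picture may help. In the sketch below the envelope width, the neighbour weighting and the update rule are all invented for the sake of illustration: each element keeps varying at random, yet is only allowed to stray within a band set by its neighbours, so the collection as a whole stays recognisable while never standing still.

    # Toy picture of probability envelopes: each element varies randomly but is
    # clipped to a band around the mean of its two ring neighbours. All
    # parameter values are arbitrary assumptions for illustration.
    import random

    random.seed(2)
    states = [random.uniform(0.0, 1.0) for _ in range(100)]
    ENVELOPE = 0.1  # how far an element may stray from its neighbours' mean

    for step in range(10000):
        i = random.randrange(len(states))
        left = states[(i - 1) % len(states)]
        right = states[(i + 1) % len(states)]
        centre = (left + right) / 2
        proposed = states[i] + random.uniform(-0.05, 0.05)   # local variance
        lower, upper = centre - ENVELOPE, centre + ENVELOPE  # the envelope
        states[i] = min(max(proposed, lower), upper)         # variance kept in check

    print("spread of states:", max(states) - min(states))

Whatever the exact numbers, each element's freedom to vary is bounded by the states of its neighbours, so local change does not dissolve the overall pattern.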
Observing the world in terms of complex, dynamic systems under the auspices of chaos renders many of its phenomena accessible to the observer.
To appreciate a species for its inherent stability (an oak tree has looked the same for thousands of years, yet no two of its leaves are exactly identical), and at the same time to understand that change is possible provided some triggers are strong enough to influence their functional neighbours but no more, opens the door to perceptions which would otherwise remain hidden, or at the very least be relegated to the realm of conjecture.
The stability of such systems is considerable, as the history of this planet amply demonstrates. On the other hand, compromise a sufficient number of elements and the state of the system can change quite dramatically. In mathematics such a change is called a bifurcation [2], but in real life the change is far more multi-faceted. To use a metaphor: suppose our system is a bag holding, say, ten apples. Under linear rules we remove one apple at a time and still have the 'bag of apples' system until there is only one left. In complex, dynamic systems however we could remove one apple and still have the same system, remove another and perhaps one more and still have our system. But take away one more apple and there may be no 'bag of apples' - the system has entered another state.
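The metaphor can be restated as a toy threshold model; the threshold of six apples below is an arbitrary assumption, standing in for the point at which the mutual dynamics can no longer be sustained.

    # The 'bag of apples' metaphor as a toy threshold model. Under linear rules
    # the system would degrade one step at a time; here it remains 'a bag of
    # apples' until a critical number of apples is gone, then flips into
    # another state. The threshold of 6 is an arbitrary assumption.

    def system_state(apples_left, threshold=6):
        return "bag of apples" if apples_left >= threshold else "another state"

    for apples_left in range(10, -1, -1):
        print(apples_left, "apples ->", system_state(apples_left))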
This characteristic of complex, dynamic systems has profound implications. Whether the system is a pond, a forest, or any collection of humans, all such systems can experience a sudden collapse once their mutual dynamics can no longer be sustained. The system as a manifestation will still exist, but its nature will have become different and its members will now be in unfamiliar territory.
Since each member represents a system in its own right, they will face a profound upheaval as well. The collapse of entire civilisations, as tragic as that can be, merely follows the age-old rules of chaos.
In the same vein but on a smaller scale, variances within functional clusters can modify their neighbours inside their own wider probability envelopes. The result is evolution.
As mentioned, the consequences are profound.
Complex, dynamic systems exist in a state of interdependency. No system exists in isolation from the others, although for labelling purposes an arbitrary scope is usually applied. For example, the electrical wiring inside a building can be described as a system, yet it is part of the wider power grid. Power stations rely on their own supply of energy, which in turn is part of the wider economy, having now moved away from electricity as such. Economies interact with each other as part of planet Earth. Then we have the solar system, our galaxy, and so on. Note that for any of their respective elements to interact with each other the appropriate affinity relationships have to be in place. Wearing warm clothes at home in winter, say, will not affect the city's power grid, but consuming less electricity as a consequence might.
There exists a feedback loop between systems, given their affinity relationships. Relating that back to the aspect of interdependency, subsystems will settle into a certain state which on the one hand leads to the stability of the overall system, but on the other leads to significant consequences should that state be compromised.
Interdependency implies cohesion. Decrease the latter and the interdependency will suffer.
The more complex the system, the more stable it will be due to interdependency, but there is also a higher probability that any one of its members can be compromised, because there are so many more contact points with neighbouring members. For the same reason, the higher the complexity the wider-ranging any positive or negative effects will be.
Information flows between subsystems that can be of relatively higher or lower complexity than their functional neighbours. A flow from a higher-complexity system to a lower one means the information becomes compacted; a flow from a lower- to a higher-complexity region means the information is raised in significance under the terms of that higher complexity.
This applies particularly to thought structures: for example, an adult explaining something to a child, or a child's expression being interpreted by an adult.
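As a rough sketch (the block-averaging and the re-reading against a local context below are stand-ins invented for the example, not a description of actual neural processing), compaction and expansion can be pictured like this:

    # Toy illustration of information flowing between systems of different
    # complexity. Compaction: a detailed state is summarised into fewer values,
    # losing detail. Expansion: a coarse state is re-expressed in the richer
    # system's own terms. The chosen rules are assumptions for illustration.

    def compact(detailed, factor):
        # Higher -> lower complexity: each block of values becomes one average.
        return [sum(detailed[i:i + factor]) / factor
                for i in range(0, len(detailed), factor)]

    def expand(coarse, context):
        # Lower -> higher complexity: each coarse value is re-read against the
        # receiving system's richer context.
        return [value + offset for value in coarse for offset in context]

    adult_view = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4]   # richer representation
    child_view = compact(adult_view, factor=3)    # the explanation to a child
    print("compacted:", child_view)

    childs_expression = [0.2, 0.5]                # coarse representation
    interpreted = expand(childs_expression, [0.0, 0.1, 0.2])
    print("expanded:", interpreted)               # read under richer terms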
Complexity carries a cost. If cost implies value, primitive systems are of lesser value, although more common. More complex systems, while more costly, are more likely to evolve towards greater richness.
The human mind
Using functionalities to describe a system allows the observer to address the replication of the system apart from its original components, that is, its content.
For example, a simple village pump can be duplicated in terms of content, but essentially it will always be a village pump. Understanding it in terms of functionality, however, allows us to build an impeller-based version, or even a chemical pump (think of osmosis).
In the case of the mind, by basing our model on functionalities we can move away from the neurons, neurotransmitters and so on, and transpose the system into a computer program. Several versions have been written [3].
For general questions regarding the model and its implementation, see the FAQs [4].
The reason why using the concept of functionalities is so important to describe our cognitive states has to do with the role language plays within the framework of human consciousness.
Consider the expression, "The ball is blue". What exactly is meant by "blue"? Humans have learned to say "blue" when pointing to an object after certain light waves have hit the retina and the data have entered the brain to be processed there. Quite apart from the difficulty involved in describing the moment-by-moment states of millions of neurons, even if it were possible to do so the individual electro-chemical characteristics of those neurons (ie, all those subsystems) are just that - individual electro-chemical characteristics. In themselves they do not constitute thought; only their aggregate dynamic state, sensated through consciousness, does. Therefore we literally have no words for that fundamental dynamic as it performs inside our mind, because that layer does not constitute thought which can give rise to words.
In philosophy the idea of 'qualia', the subjective, conscious experience, has engendered debates spanning centuries [5].
Using functionalities we can bypass human-based subjectivism and focus on the dynamics of complex systems at the level of their activity, since it is those dynamics we are dealing with.
The nature of complex, dynamic systems can best be described under the terms of chaos.
An analysis of some scenario under such terms should proceed from an observation of the event's content, from there to a definition of that content's functionalities, then to an analysis of their mutual interactions, and finally to how the results would manifest given the available, previously identified content.
Should the functional patterns remain the same, even predictions under varying conditions are possible. For an example regarding the outcomes in Iraq see "Notes on the Iraq Study Group Report" [6].
Perhaps it is time to add a fourth law of thermodynamics [7]:
If functional elements of one system interact with functional elements of another system, then the total number of affected functional elements is the same for both systems, ie, they exist in the form of an intersection and there are no additional functional elements.
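In code the statement reduces to a set intersection: the affected elements are one and the same collection whichever system they are viewed from. The element names below are arbitrary placeholders.

    # The proposed 'fourth law' as a set statement: the functional elements
    # affected by an interaction form the intersection of the two systems'
    # participating elements: the same set seen from either side, with nothing
    # added. Element names are arbitrary placeholders.

    system_1 = {"a", "b", "c", "d"}
    system_2 = {"c", "d", "e"}

    affected = system_1 & system_2   # the intersection
    print(sorted(affected))          # ['c', 'd'] for both systems alike

    # No additional functional elements exist outside either system:
    assert affected <= system_1 and affected <= system_2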
Since chaos is an inherent feature of complex, dynamic systems, it also teaches us about life itself.
1. "Chaos theory", http://en.wikipedia.org/wiki/Chaos_theory, accessed on internet 23 June 2013.
2. "Bifurcation theory", http://en.wikipedia.org/wiki/Bifurcation_theory, accessed on internet 23 June 2013.
3. M. Wurzinger, "AI Programs", On the origin of Mind website, http://www.otoom.net.
4. M. Wurzinger, "FAQs", http://www.otoom.net/faqs.htm.
5. "Qualia", https://en.wikipedia.org/wiki/Qualia, accessed on internet 23 June 2013.
6. M. Wurzinger, "Notes on the Iraq Study Group Report",
7. "Laws of thermodynamics", https://en.wikipedia.org/wiki/Laws_of_thermodynamics, accessed on internet 15 September 2019.