Patterns of Emergence: Abstractions in Chaos Theory
A butterfly in Brazil flaps its wings, eventually causing a tornado in Texas. Even if this phrase doesn’t sound familiar, I’m sure you’ve at least heard of the term butterfly effect. The idea emphasizes sensitive dependence on initial conditions: how a seemingly trivial difference in starting conditions can lead to a chain of events that magnifies tiny discrepancies over time. Notably, the well-known butterfly effect often outshines its origin: chaos theory. I think it’s time we turn some attention towards the director.
Chaos theory is by far one of the most interesting theories in the field of mathematics. For those who are unfamiliar, chaos theory reveals that there is order underlying chaos, and patterns underlying unpredictability. Its concepts rely on mathematical methods for describing the behavior of nonlinear, dynamical (evolving) systems. Chaos theory is a vast topic of study, and its details can get complex and extend well beyond the scope of my understanding. What this article will focus on, however, is the bigger picture behind chaos theory and how its applications branch into the fascinating study of patterns underlying our sophisticated universe.
So, what exactly defines chaos theory? To understand, we must first clarify a few definitions and establish the concept of chaotic determinism. When a system is said to be chaotic, it exhibits irregular, aperiodic, and unpredictable behavior. When a system is deterministic, its future behavior follows a unique evolution that leaves no room for “randomness” and is thus restricted to a single outcome.
At first glance, a system described as both chaotic and deterministic seems somewhat paradoxical. How can something unpredictable and sporadic not be subject to randomness? Randomness is the absence of any pattern whatsoever; it signifies a stochastic system with no correlation between variables and is therefore not deterministic. Chaos, on the other hand, can exhibit some form of correlation between its variables. It is deterministic because the initial and current states entirely determine the future state, leaving no room for randomness. Chaotic behavior is unpredictable but can nevertheless retain an underlying pattern that is deterministic. In short, chaos is bound to a certain outcome whereas randomness is not.
The entirety of chaos theory can be reduced to one statement: simplicity and order can spawn complexity. This means that underlying chaos, there is a specific, although unpredictable, pattern. Fascinatingly, this principle is found everywhere in the natural world we live in. Take, for example, the intricate design of a Romanesco broccoli. Clearly, there is a determined pattern to it; we can have an idea of what the next sprout will look like, yet the exact configuration can never be predicted.
The Mandelbrot Set, for example, reveals that chaotic systems can emerge through iterations (repeated processes) of simple functions. What’s intriguing is that underneath the chaotic behavior lies a universal ratio known as the Feigenbaum constant. This number is approximately 4.669 and comes from the ratio of the widths of successive period-doubling intervals in a bifurcation diagram. In the real world, these chaotic systems can describe patterns such as the population growth of rabbits, the frequency of water droplets leaking from a faucet, or perhaps even the neuronal firing patterns in our brain.
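To see how little machinery is needed, here is a minimal sketch in Python of the logistic map, the textbook iterated function behind those bifurcation diagrams. The specific growth rates, starting point, and iteration counts below are illustrative choices, not anything canonical:

```python
# A minimal sketch of the logistic map x_{n+1} = r * x_n * (1 - x_n),
# a simple deterministic rule whose repeated iteration can turn chaotic.
# The growth rates below are illustrative: r = 2.8 settles to a fixed point,
# r = 3.2 oscillates between two values, r = 3.5 between four, and
# r = 3.9 wanders chaotically.

def logistic_orbit(r, x0=0.5, warmup=500, keep=8):
    """Iterate the map, discard the transient, and return the last few states."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):
        x = r * x * (1 - x)
        orbit.append(round(x, 4))
    return orbit

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r = {r}: {logistic_orbit(r)}")
```

Plotting the long-run values against the growth rate r produces the familiar bifurcation diagram, and the ratio of the widths of successive period-doubling windows approaches the Feigenbaum constant.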
Attractors are regions of phase space toward which a system’s trajectories eventually evolve. An attractor describes a system, or a set of values, tending towards a characteristic behavior. Trajectories can converge to a single point, settle into a repeating cycle, or stay within a bounded region while following a certain pattern of movement. More specifically, strange (chaotic) attractors describe the trajectory of chaos.
“Unlike the randomness generated by a system with many variables, chaos has its own pattern, a peculiar kind of order. This pattern is known whimsically as a strange attractor, because the chaotic system seems to be strangely attracted to an ideal behavior.” — Gary Taubes, Discover, May 1989
In a strange attractor, there are infinitely many trajectories packed into a finite region of space, and the attractor has a non-integer dimension, meaning it does not lie within one-dimensional, two-dimensional, or three-dimensional space but instead somewhere in between, called a fractal dimension. Shown below is the Lorenz Attractor. As we can see, there is an apparent pattern, yet the values underlying the configuration of this pattern are completely chaotic.
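If you want to trace the attractor yourself, here is a rough sketch in Python that integrates the Lorenz equations with a simple Euler step. The classic parameters (sigma = 10, rho = 28, beta = 8/3) are the ones for which the strange attractor appears; the step size, duration, and use of matplotlib are just convenient assumptions for illustration:

```python
# A rough sketch of the Lorenz system, integrated with a simple Euler step.
# sigma, rho, beta are the classic parameter values for which the strange
# attractor appears; dt and the number of steps are arbitrary choices.
import matplotlib.pyplot as plt

def lorenz_trajectory(x=1.0, y=1.0, z=1.0, sigma=10.0, rho=28.0,
                      beta=8.0 / 3.0, dt=0.01, steps=10_000):
    xs, ys, zs = [], [], []
    for _ in range(steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dx * dt, y + dy * dt, z + dz * dt
        xs.append(x)
        ys.append(y)
        zs.append(z)
    return xs, ys, zs

xs, ys, zs = lorenz_trajectory()
plt.plot(xs, zs, linewidth=0.3)  # the familiar butterfly-shaped projection
plt.xlabel("x")
plt.ylabel("z")
plt.show()
```

Euler integration is crude (a proper ODE solver would be more accurate), but the butterfly-shaped pattern already emerges.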
Furthermore, these attractors cannot predict how any single state evolves; what they can do is reveal the general pattern within a collection of individual trajectories. This emphasizes the importance of recognizing pattern in an ensemble of elements, rather than in the individual elements alone.
One way to understand the concept of a strange attractor is to imagine a system that is initially in a state of instability or chaos, such as the weather or the motion of a pendulum driven by a series of unpredictable forces. As the system evolves over time, it may appear to settle into a repetitive pattern or trajectory, but this pattern is never exactly the same each time it repeats. Instead, it is characterized by a series of small fluctuations that cause the system to diverge from its previous trajectory and explore new regions of the solution space. Over time, these fluctuations accumulate and produce a complex, non-repeating pattern that appears to be attracted towards a strange attractor.
The general behavior of the system seems consistent. This, however, should not be confused with predictability. The state of chaos is still unpredictable; it is only the macroscopic system that seems to follow a pattern. Basically, distinguishing the behavior of an individual element is hard, but we can study a collection of them to analyze the interdependent behavior of the whole. Although 100% certainty can never be achieved when modeling real-world systems (more on this when we get to quantum states), viewing a collection of matter as an entirety, rather than as its individual elements, allows a better understanding of the working system.
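As a concrete (if toy) illustration of this ensemble view, here is a short Python sketch reusing the logistic map from earlier: two trajectories that start a hair apart become unrecognizably different within a few dozen steps, yet a simple ensemble statistic computed over each orbit comes out nearly identical. The starting values and the r = 3.9 growth rate are arbitrary illustrative choices:

```python
# Two chaotic trajectories with nearly identical starting points diverge,
# yet a simple ensemble statistic over each orbit comes out almost the same.
r = 3.9  # an illustrative growth rate in the chaotic regime of the logistic map

def orbit(x0, n):
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = orbit(0.200000, 5000)
b = orbit(0.200001, 5000)  # differs only in the sixth decimal place

# The individual states become unrecognizably different within a few dozen steps...
print("step 50:", round(a[50], 3), "vs", round(b[50], 3))

# ...but the fraction of time each orbit spends below x = 0.5 is nearly equal.
print("time below 0.5:",
      sum(x < 0.5 for x in a) / len(a),
      sum(x < 0.5 for x in b) / len(b))
```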
Not far from chaos theory lies the strange and intimidating world of quantum mechanics. QM studies the most fundamental and basic units of our universe. It breaks reality down into its simplest form and focuses on the intriguing behaviors of subatomic particles (electrons, positrons, neutrons, etc.) in isolated systems. As a consequence of its meticulousness, QM (although it undoubtedly has useful applications) holds theories that extend beyond what is needed for day-to-day experience. It is rather impractical to go about our daily lives viewing things from a quantum perspective, to wake up and say, “I’m tired, I need a cup of quarks in opposing spin states”. However, many of its concepts are foundational to our understanding of patterns, and a conversation about the universe can hardly leave physics untouched.
To describe a system in QM, any possible state that a particle can be in is referred to as a quantum state. These states are characterized by the Schrödinger wave function, which describes the probability of finding a particle within a specific region at any time in the future, given its exact initial conditions, indicating that the future state of the particle is described only probabilistically.
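For readers who like to see the notation, this is usually expressed through the Born rule: the squared magnitude of the wave function gives a probability density, so the chance of finding the particle between positions a and b at time t is an integral, not a definite answer. Under the usual textbook assumptions (a single particle in one dimension), it reads:

```latex
P(a \le x \le b,\; t) = \int_{a}^{b} \lvert \psi(x, t) \rvert^{2} \, dx
```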
This is where QM and chaos theory seem to encounter one another. What is the relationship between probabilistic outcomes in QM and chaotic determinism in chaos theory? When something is probabilistic, randomness is a factor in the outcome, yet the distribution of outcomes is surprisingly predictable in terms of percent chances. Chaotic determinism, as mentioned previously, is the opposite: it is not subject to randomness, yet it is not predictable. This seems extremely counter-intuitive, but we must remember that the predictability of QM concerns only the probabilities of a particle’s future states, not an actual determined state. The wave function does not specify where an electron is, but rather shows where it is likely to be found. In other words, QM is not certain, but rather highly accurate. And to reiterate (because the terms can get extremely confusing): probabilistic QM is not restricted to a certain outcome, while deterministic chaos is restricted to a certain outcome.
A famous thought experiment in physics known as Laplace’s Demon considered the idea that if an extremely powerful entity were to know the exact initial conditions of every particle in the universe, the future could be entirely predicted. This changed the entire course of how people understood determinism and free will, but there were many cracks in the argument. These philosophical topics will be saved for another time. For now, let’s consider another example similar to Laplace’s Demon.
Say we were traders who knew a thing or two about quantum physics. If we were to analyze every single position and velocity of every aspect of the current market (not sure what that would even entail), we could maximize our prediction accuracy, but never actually determine the future market with 100% certainty. We could calculate the probability of the value of a stock five years into the future, but never say with certainty exactly where it will be.
This is where we come face to face with a significant gap in the study of QM: it can be highly unfeasible to model real-world emergent phenomena. The uncertainty is captured by the Heisenberg Uncertainty Principle: we cannot simultaneously measure a particle’s position and momentum with perfect precision, as there is a fundamental tradeoff between the two. What this means is that it is impossible to measure a particle’s exact initial conditions in both position and velocity. Therefore, we can kiss our dreams of someday conquering the market goodbye.
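In symbols, the standard statement of the principle bounds the product of the uncertainties in position and momentum from below by a constant of nature:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

Squeeze the position uncertainty toward zero and the momentum uncertainty must blow up, and vice versa; the exact initial conditions Laplace’s Demon would need are simply off the table.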
Probabilistic QM and deterministic chaos theory eventually converge on the same conclusion: the emergent world can be described by mathematical systems, and these systems themselves are the patterns of the universe.
Now that we’ve discussed a bit about chaos theory and quantum mechanics, I will attempt to introduce a rather abstract thought that occurred to me, involving the compression of disorder. By no means is this a validated theory; it may even conflict with physical laws beyond what I am aware of, but it is nonetheless interesting and worth discussing.
By definition, entropy is the degree of disorder, or randomness, in a system, not to be confused with chaos (which is not random). The term entropy quantifies the degree of “randomness” in a system rather than denoting the actual behavior of the system, so in my proposition, the two terms can be treated as compatible with one another.
Entropy can be seen as a count of all the possible ways a system can be rearranged while still conserving the same amount of energy. German physicist Rudolf Clausius originated the concept as “energy gone to waste” in the 1850s, and it eventually evolved into the modern idea of “the energy of a substance that is no longer available to perform useful work”. Save this thought; we will come back to it.
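This counting picture has a standard formula behind it, Boltzmann’s entropy, which ties the entropy S directly to the number of microscopic rearrangements W compatible with the same macroscopic state:

```latex
S = k_B \ln W
```

The more ways a system can be rearranged without anyone noticing from the outside, the higher its entropy.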
The total entropy of the universe is always increasing, and can never decrease naturally. This is the overall content of the second law of thermodynamics, which states that heat transfers spontaneously (naturally) from hot to cold, and never the reverse. Atomic arrangements at a lower energy state are more stable, making them naturally favorable, so when a system drops to a lower energy state, energy is released. The law states that all energy transfers increase the total entropy of the universe, and every energy transfer results in the loss of some usable energy. *An interesting side note: one of the most important implications of the second law is that it also indicates the arrow of time, which naturally flows in the direction of increasing disorder. However, the concept of time would require a few pages of analysis, so that will also be saved for another discussion.
But for now we know these facts:
- Entropy of the universe is always increasing.
- Every energy transfer increases the total entropy of the universe.
So how does entropy relate to everything we know about the universe? Our elementary school science classes taught us that everything in the universe is made up of matter. To supplement this statement, the most basic units of matter cannot exist alone to form things, for lack of a better word. Everything that surrounds us (people, trees, tables, and so on) is a bunch of atomic matter that exhibits a certain structure, and its unique atomic order makes all the difference. The only difference between the tip of a pencil and your mother’s wedding ring (if your father wasn’t stingy) is the arrangement of its constituent carbon atoms. (*Different crystal structures of carbon form graphite or diamond.)
The importance of order for matter and energy to exist implies that there must be at least some form of correlation between individual particles for their emergent systems to exist. For example, let’s imagine that every particle in the universe is a lego block. There is always motion in a particle, so let’s envision a bunch of high-energy lego blocks in motion. A lego block alone is nothing more than a lego block. Only when the lego blocks interact with other blocks to form some order and formation can they create magnificent structures. Similarly, a letter by itself is full of chaotic potential energy and has no meaning (unless it’s I or a) until it assembles into a specific configuration, releasing its energy into the linguistic environment to create connotation. For example, the letter h by itself is quite full of chaotic energy; it’s completely unpredictable what that h can become. Once order comes into play, it can become a word for the animal horse, a historical villain like Hitler, or even something induced by this article: a headache.
Hence, when the entire universe is viewed at a singular and isolated level, everything must be chaotic. Without order, there is nothing but a space of atoms and molecules floating around aimlessly, contingent specks scattered across space. It is not until those seemingly random particles assemble into certain formations of order that “something” forms.
So now we have some statements describing matter and energy:
- The order of atoms makes all the difference.
- The most basic building blocks of matter, when isolated, are inherently chaotic.
- All emergent matter maintains some form of structure or order.
- Maintaining structure requires energy.
We’ve placed emphasis on order, so what if we were to take a new approach and view all existing forms of emergent matter as the compression of entropy? It seems as if systems tend to search for order: to stabilize their behavior and settle into structure, thereby maintaining a lower state of internal entropy (while external entropy still increases).
In other words, all components of the universe seem to be in a breath-work practice with increasing entropy: inhalations of increasing entropy, exhalations of rhythmic contractions forming matter and releasing energy. This speculation is rather abstract and wishy-washy, but it describes the existence of emergent systems as the compression of entropy itself. The breath of the universe.
The term “negentropy” is used to denote the inverse of entropy, and describes systems becoming more ordered. Going back to the saved thought: if the idea of entropy is “energy gone to waste”, then its opposite must mean “energy put to use”. So can negentropy be the explanation for the formation of matter and the release of energy? From this perspective, emergent systems are not only the compression of entropy but also the harnessing of chaos. By this, I mean the evolution of disordered systems that ultimately “contract” into ordered compositions.
To apply this perspective to a real-world standpoint, we can use our cognitive processes as an example. All aspects of cognition (memory, perception, learning, etc.) are basically the compression of information. Learning is a process of condensing the information we receive so that it can later be encoded into memory. This “information”, inherently chaotic, is a plethora of different variables that “set” an internal structuring of our brain chemistry. Likewise, perception is the process of compacting external stimuli to form a comprehensible projection of our experiences.
Let me provide an example situation. Think about everything that happens in the following setting. A vast spectrum of different wavelengths of light is bouncing around, a series of letters and digits presents itself on a display, a gravitational force ties your body down to an uncomfortable chair, thoughts about roasted chicken arise in your mind as your stomach grumbles, all the while some old man is producing a combination of sound waves that hazily makes out something about income statements.
Without proper compression of information happening within the brain, this scene is nothing more than a melting pot of pure chaos. Only when the external chaos is translated into order by the brain can you realize: I am currently sitting in my financial accounting lecture, and wow, it’s terribly boring.
So by viewing matter, energy, and everything we know as products of “ordered chaos”, we can get an idea of the way assembled patterns (order) arise in the intricacies of life. This adds an extra dimension to the conclusion mentioned above, that “the emergent world can be described by mathematical systems, and these systems themselves are the patterns of the universe”. Now, the “patterns of the universe” can be described with order as well as with mathematical derivations.
In that case, order can also arise from chaos. After all, isn’t mathematics itself some form of order for human understanding? Functions, equations, and formulas compress chaotic variables into a single expression to make them make sense. A function takes input, which by itself is chaotic, and compresses it through an equation, an order of events, which results in an output. Likewise, as individual particles go through an order of events, the external universe increases in entropy, and the rearrangement possibilities and “patterns” of matter increase.
This speculation adds another dimension to chaos theory, where not only can order underlie chaos, but order can also emerge from chaos. The question then becomes: is there a way to reverse the process and model the underlying patterns of these complex systems in our universe? If this theory is true, everything that is part of the universe, such as consciousness, must exhibit some underlying pattern that we can map.
If you’ve read my first article, we can extrapolate this new standpoint onto our understanding of neural networks. Rather than reengineering the biological brain on a neuron-to-neuron basis, we can focus on mapping out the macroscopic patterns of the larger (emergent) working system. The top-down approach (order emerging from chaos) would help the biological study of cognition: a reverse engineering of the brain to find underlying patterns present in a chaotic neural system. A bottom-up initiative (order underlying chaos) would suit the technological side of things: a build-up of high-entropy systems starting with a simple pattern that eventually iterates into a chaotic system. Both involve the same concept of the transmutation of order and chaos, whether the states transition from order to chaos or vice versa. Fields such as chaos computing are already emerging, where advanced algorithms allow chaotic systems to be used for computational purposes. See? Science is pretty cool.
Phew, that was a lot of information. To sum up everything we’ve unpacked into two words: patterns, man. And if you’re at all confused by all the concepts, you have good reason to be. Many of the mathematical theories we have today describing order, chaos, probability, predictability, and determinism go against our basic intuitions. Physicists and mathematicians all over the world are still intensely learning and improving our knowledge about reality. As they should be. The deeper we dive, the more we can appreciate the depths of our universe.