# Chaos

Nietzsche, in the premise of Thus Spoke Zarathustra, stated the Eternal Return, the idea that all things in the universe will recur and all events in one's life will repeat, again and again, ad infinitum [1]. Nietzsche's world was a deterministic one, Newtonian, with tangible objects and countable sets. Hence, he viewed his universe like a deck of cards, bound to return to a particular shuffle infinitely over infinite shuffles. But all recurrences need not be perfectly self-similar. Far older eyes have imagined universes in kalpas, forever repeating but never quite exactly, while current scientific models view them all in parallel, where each universe differs just slightly from the others. How then are such universes made possible?

This is the third post in the series, and 3 is linked to chaos [2]. Stepping back a little: I had always wondered why, with all the triumphs of science, with technology to reach the moon and back and predict the spin of an electron correctly to 14 decimal places, we could not predict whether it would rain on a given afternoon. This made me cognizant of the large gaps between some areas of applied and theoretical science. Then one day I stumbled upon a lecture that argued chaos is a prime example of where reductionism fails. I started noticing concepts of chaos everywhere (more RAS at work than synchronicity), and finally came across some recent studies on chaotic models of the universe, which did it for me. From then on, I needed to know more. Hence, this blog is my meditation on chaos. We will start by exploring what it means for things to be chaotic. We will test conventional methods to predict chaos, note their limitations, and finally see whether a class of neural networks with memory-based learning units, such as LSTMs, can help predict the behavior of some simple chaotic systems.

### Theory

Chaos can be summarized as aperiodic long-term behavior of a deterministic system with sensitive dependence on initial conditions [3]. Aperiodicity implies a lack of rigid temporal structure, while sensitivity to initial conditions implies that small changes in the earlier states of the system result in exponentially large differences in later states. Finally, deterministic implies that no randomness is allowed in the inputs. We will scrutinize each feature through the lens of the four 1-D functions below:

$$y = \sin(\omega t), \quad t \in \mathbf{R},\ \omega \in \mathbf{R^+} \hspace{10mm} \text{(1)}$$

$$x_n = \mathrm{Fib}(n) \bmod p, \quad n \in \mathbf{N},\ p \text{ prime} \hspace{10mm} \text{(2)}$$

$$x_{n+1} = \lambda x_{n} (1-x_n), \quad n \in \mathbf{N},\ \lambda \in \mathbf{R^+} \hspace{10mm} \text{(3)}$$

$$x_n = \mathrm{Digit}_{\pi}(n), \quad n \in \mathbf{N} \hspace{10mm} \text{(4)}$$

Eq 1 is the sine function; let's set the frequency $f = \omega / 2\pi = 0.04$ for the upcoming experiments. Eq 2 gives the $n^{th}$ Fibonacci number modulo a prime; let $p = 113$. Eq 3 is the logistic map; we let $\lambda = 4$ and the initial value $x_0 = 0.2$. Eq 4 is a function that returns the $n^{th}$ digit of Pi for a given $n$. Eq 1 is a textbook periodic function, while Eq 3 with $\lambda = 4$ is a textbook chaotic one. We don't know yet whether Eq 2 and Eq 4 are chaotic. The plot of $f(x) \text{ vs } x$ for the 4 equations can be seen below. Note that we will use the terms map and function interchangeably in this blog, treating discrete iterations as continuous time in some of the plots below. While this is incorrect in a strict mathematical sense, I believe the approach is justified when our analysis tools are all computational and our ethos mostly experimental.
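For reproducibility, the first three series can be generated in a few lines of Python. This is a minimal sketch under the parameter choices above ($f = 0.04$, $p = 113$, $\lambda = 4$, $x_0 = 0.2$); the function names are my own, and Eq 4 is omitted since arbitrary-precision digits of Pi need an external library such as mpmath.

```python
import math

def sine_series(n, f=0.04):
    """Eq 1: y = sin(2*pi*f*t), sampled at integer t."""
    return [math.sin(2 * math.pi * f * t) for t in range(n)]

def fib_mod(n, p=113):
    """Eq 2: the first n Fibonacci numbers modulo a prime p."""
    a, b, out = 0, 1, []
    for _ in range(n):
        out.append(a % p)
        a, b = b, a + b
    return out

def logistic(n, lam=4.0, x0=0.2):
    """Eq 3: n iterates of the logistic map starting from x0."""
    x, out = x0, []
    for _ in range(n):
        out.append(x)
        x = lam * x * (1 - x)
    return out
```

Plotting these against the index reproduces the figure: Eq 1 is visibly periodic, Eq 2 looks erratic but repeats (Fibonacci mod p is eventually periodic), and Eq 3 never settles.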

The logistic map (Eq 3) turns out to be a simple model for many real-life dynamics, such as population growth, predator-prey dynamics, Josephson junctions, etc. [4]. Chaos can be observed in the logistic map when we plot its long-term behavior against the parameter $\lambda$. For example, let $R_{\lambda}$ be the set of values the iterates $x_n$ settle into for a given $\lambda$; $R_{\lambda} \text{ vs } \lambda$ for $0 \le \lambda \le 4$ is plotted. One can note that the map converges to a single fixed point $x^*$ for $0 \le \lambda \le 3$, while the attractor bifurcates into a 2-cycle in the range $3 < \lambda \le 3.45$, and so on. At $\lambda \ge 3.57$, we observe the onset of chaos and the emergence of many unstable orbits [4]. From Fig 2 it's clear that a function as simple as Eq 3 can exhibit highly complex behavior.
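The bifurcation structure described above is easy to check numerically. The sketch below (function name is my own) discards an initial transient and collects the values the orbit settles into for a given $\lambda$; sweeping $\lambda$ over $[0, 4]$ and scatter-plotting the results reproduces a diagram like Fig 2.

```python
def attractor_points(lam, x0=0.2, n_transient=500, n_keep=100):
    """Long-term iterates of the logistic map for a given lambda."""
    x = x0
    for _ in range(n_transient):  # discard the transient
        x = lam * x * (1 - x)
    pts = []
    for _ in range(n_keep):       # collect the settled orbit
        x = lam * x * (1 - x)
        pts.append(x)
    return pts

# lam = 2.8: a single stable fixed point near x* = 1 - 1/lam
# lam = 3.2: a stable 2-cycle (two distinct values)
# lam = 4.0: no settling at all; the orbit wanders over (0, 1)
```

The three sample values of $\lambda$ in the comments land in the three regimes named above: one fixed point, a 2-cycle, and chaos.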

### Summary

The LSTM networks were successful at signal reconstruction and multi-step prediction for Eqs 1-3. For Eq 3, the logistic map with $\lambda = 4$, the network's prediction horizon was about 7 iterations into the future before the test error exceeded $0.1$. Given an LSTM training error of about 1e-3, this is consistent with the Lyapunov time we computed for an initial perturbation of 1e-3. We were also successful at reconstructing the logistic map with a small amount of added noise, with a reduced prediction horizon of 4 iterations. Compared to the other conventional methods discussed in the Experiment section, the LSTM was robust in its multi-step predictions. For Eq 4, the network overfit the data and failed to generalize; however, looking at the dense Poincaré plot of Eq 4 in Fig 6, this was to be expected.
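The consistency claim above admits a quick back-of-the-envelope check. The sketch below (function name is my own) estimates the Lyapunov exponent of the logistic map at $\lambda = 4$ by averaging the local stretching rate $\log|f'(x)|$ along an orbit (the analytic value is $\ln 2$), then computes how many iterations a perturbation of 1e-3 needs to grow past the 0.1 error threshold.

```python
import math

def lyapunov_logistic(lam=4.0, x0=0.2, n=10000):
    """Average log-stretching rate log|f'(x)| along the orbit."""
    x, s = x0, 0.0
    for _ in range(n):
        s += math.log(abs(lam * (1 - 2 * x)))  # |f'(x)| = |lam * (1 - 2x)|
        x = lam * x * (1 - x)
    return s / n

lyap = lyapunov_logistic()             # numerically close to ln 2 ~ 0.693
horizon = math.log(0.1 / 1e-3) / lyap  # iterations until a 1e-3 error reaches 0.1
```

With error growing as $\epsilon_0 e^{\lambda_L n}$, the horizon $\ln(0.1/10^{-3})/\ln 2 \approx 6.6$ iterations matches the observed cutoff of about 7.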

I hope by now we look at chaos in a different light. I prefer the word complexity, for the word chaos itself carries a negative connotation. It could be an artifact of Newtonian determinism mixed with the lack of algebraic techniques to study and predict chaotic systems, but I find physicists disliking the concepts of chaos and doing everything to avoid them in textbooks, while engineers try hard to control chaotic systems with limited success. This limitation only shows the complexity of the dragons we are dealing with, which should be exciting and taken seriously. For what was magic yesterday is today's science, and while the aether may have been abolished, the magic still persists in earth, air, water, and fire, all with varying degrees of chaotic dynamics. And so the challenges of climate change, space travel, geoengineering, nuclear fusion, beating cancer, and studying the origins of life all still seem out of reach. But 21st-century science steps up.

### References

1. Notes on the Eternal Recurrence, Vol. 16 of the Oscar Levy edition of Nietzsche's Complete Works.
2. Li, Tien-Yien, and James A. Yorke. "Period Three Implies Chaos." The American Mathematical Monthly, Vol. 82, No. 10 (Dec. 1975), pp. 985-992.
3. http://farside.ph.utexas.edu/teaching/329/lectures/node57.html
4. Strogatz, Steven H. Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering. Perseus Books, 1994 (pp. 21, 367, 371).
5. Logistic Map, http://www.physics.drexel.edu/~bob/PHYS750_NLD/ch2.pdf (p. 22)
6. Grunwald, Peter. "Shannon Information and Kolmogorov Complexity." arXiv, 1 Oct. 2004, arxiv.org/abs/cs/0410002.
7. Jensen, F. V. Bayesian Networks and Decision Graphs. Springer, 2001 (p. 69).
8. https://colah.github.io/posts/2015-08-Understanding-LSTMs/
9. https://machinelearningmastery.com/time-series-prediction-lstm-recurrent-neural-networks-python-keras/
