**W**hile investigating digestion, Mario
Markus of the Max Planck Institute for Nutrition discovered beauty
in chaos. He and his collaborator Benno Hess have studied several
mathematical models in an attempt to simulate how enzymes break
down carbohydrates. By adjusting a pair of parameters, they found
they could make the simulated enzymes behave in either an orderly
or chaotic manner. To illustrate the chaos inherent in the model,
Markus created a series of pictures that provided not only food
for thought but also a feast for the eyes.

The images are based on a formula named after the Russian mathematician Aleksandr M. Lyapunov. The formula generates a single number for any dynamic system. Known as the Lyapunov exponent, the number indicates how chaotically the system is behaving. When the Lyapunov formula is applied to Markus' model, it produces an exponent for each pair of parameters. By representing each pair as a point on a computer screen and assigning colors to each point depending on the value of the exponent, Markus created what I [A.K.Dewdney] call Lyapunov space. In this virtually unexplored territory, order inhabits colored shapes, and chaos lurks in black regions.

Not long after his work appeared in academic journals, Markus rushed several pictures to an art gallery for exhibition. He can hardly be blamed for doing so. The pictures could just as easily have been made by an apprentice to Salvador Dali.

The model developed by Markus is based on a variation of the
so-called logistic formula, the simplest-known formula that
describes a chaotic dynamic system. The formula contains a
variable, *x*, whose value is always somewhere between 0 and
1. It also involves a system parameter, *r*. The formula
can be written:

*x* → *rx*(1 - *x*)

The arrow indicates that once *r, x*, and 1-*x* have all
been multiplied together, the resulting number becomes the new
value for *x*, that is, it replaces the old value. The
process can be repeated so that the formula continually spews out
new values for *x*.

The resulting sequence holds great fascination for dynamicists,
but what does it all mean? The logistic equation gets its name
from the logistics of animal populations. In the equation, *x*
represents the number of animals in an isolated region divided by
the maximum number that the region could ever be expected to
support. The amount of food available is therefore proportional to
*1-x*. In other words, as the number of animals (*x*) approaches the
maximum (1), the food supply (*1-x*) dwindles to nothing (0). The
parameter *r* expresses the proportionality. It may be thought of as the
fecundity of the population. The higher the value of *r*, the more
quickly the population will rebound from any disaster. Strangely
enough, higher values are precisely the ones that lead most
quickly to chaotic populations.

Although the equation is too simple to represent real animal populations, it can serve as a rough approximation to population dynamics.

If the parameter *r* is less than 3, the sequence of numbers
produced quickly homes in on a single value. It makes no
difference what number the formula uses for *x* initially.
The population always converges to a stable value. In the jargon
of chaos theory, a system whose dynamics stabilize at a single
value is said to have a one-point attractor.

If the parameter *r* is greater than 3 but less than about
3.45, the logistic formula generates numbers that eventually
alternate between two values. The system then converges on a
two-point attractor. In some sense, when fecundity is high, the
population pays a price: its size fluctuates over time.

If the fecundity factor is cranked up to values greater than
3.45, the logistic formula produces numbers that converge on a
four-point attractor. Still higher values of *r* lead very
quickly to eight-point attractors, then 16-point ones and so on.
But if the value of *r* is greater than 3.57 (or, to be more
precise, 3.56994571869), chaos reigns.

At that level of fecundity, the formula seems to generate values at random, though, to be sure, the formula is deterministic. The reason for this strange behavior lies in the attractor. It happens to be a one-dimensional fractal. Like all fractals, it is self-similar: when any small piece of it is magnified, the enlarged region looks very much like the whole.

The fate of the hypothetical populations is clearly portrayed in
this graph. The diagram is produced
by plotting *r* against the
values to which the logistic formula converges. The result is a
kind of tree. One-point attractors make up the trunk; two-point
attractors, the first pair of branches. At an *r* value of 3.57, the
onset of chaos can be seen clearly: the branches suddenly
proliferate.

Chaos can be characterized using the Lyapunov formula. For each dynamic system, the formula produces a single number, the Lyapunov exponent. If the number is less than 0, the system is stable. If the number is greater than 0, the system is capable of chaotic behavior.

The Lyapunov formula is complicated, but it can be translated into
a series of simple steps. In the case of the logistic system, we
start with one particular value of *r*. The logistic formula is
iterated, say, 600 times, so that the numbers converge to whatever
attractor is present in the system. After the numbers settle down,
it is safe to compute the Lyapunov exponent. The following recipe
outlines the computation:

[note: original example was in Basic or something similar, it's been converted to "C" for this text]

    total = 0;
    for (n = 1; n <= 4000; n++)
    {
        x = r * x * (1 - x);
        total += log(fabs(r - 2 * r * x)) / log(2.0);
    }
    lyap = total / 4000;

The algorithm first sets total=0 and then iterates the logistic
formula 4,000 times. On each iteration it computes a new value for
total by adding the old value of total to the logarithm of |*r-2rx*|
divided by the logarithm of 2. (The vertical bars indicate the
absolute value of *r-2rx*.) The quantity |*r-2rx*| represents the rate
at which the magnitude of successive values is growing or
shrinking. When it has added up all 4,000 logarithms, the
algorithm divides the sum by 4,000. The result, which has been
assigned to the variable lyap above, is something like an average
logarithm of change. The result closely approximates the Lyapunov
exponent.

Readers who demand precision can estimate the Lyapunov exponent more accurately by increasing the number of iterations and then, at the end of the procedure, dividing the sum of the logarithms by the new number of iterations.

I encourage readers to use the algorithm above to calculate the
Lyapunov exponent for *r* equal to 2. Then compare the result with
that obtained when *r* equals 3.7. The first number should be
negative, indicating a stable system, and the second number should
be positive, a warning of chaos.

The pictures accompanying this article are all based on the
logistic equation. Markus merely adds one twist of his own. To
produce his pictures, Markus used periodic forcing. This means
that *r* systematically changes its value, alternating between two
fixed numbers, *a* and *b*. In other words, the logistic equation is
iterated with *r* values of *a*, then *b*, *a*, *b*, *a*, *b* and so on. The
resulting system may or may not develop chaotic behavior. The
issue can be settled only by calculating the Lyapunov exponent.

For that reason, Markus plotted the value of the exponent for each
possible combination of *a* and *b* values. A two-dimensional image in
Lyapunov space emerges when the points (*a,b*) are colored in some
consistent fashion. Markus assigned black to all points (*a,b*) that
produce a non-negative value for the Lyapunov exponent. Chaos is
black. Shades of a single color, such as yellow, appear everywhere
else in Lyapunov space. From a Lyapunov exponent of zero down to
minus infinity, the shade ranges continuously from light to
dark. At zero, there is an abrupt discontinuity as the color
suddenly turns from bright yellow to black.

The resulting images are maps of chaos in the forced logistic
system. In particular,
this map depicts the straightforward
system just described. The parameter *r* alternates in
completely regular fashion between *a* and *b*.

The crossing point of two or more spikes in any of the
accompanying images reveals the coexistence of periodic
attractors. This means that at a point (*a,b*) where such a crossing
occurs, the corresponding dynamic system in which *r* alternates
between *a* and *b* will have two attractors. Which attractor operates
depends, strangely enough, on the initial value that one chooses
for *x* before iteration.

If the Lyapunov exponent is plotted for a succession of initial *x*
values, it may take on a specific value, say 0.015 for a number of
these initial values. Then the exponent may suddenly switch to
another value, 0.142, to which it may stick for several more
successive initial values before reverting to the first value. The
switching back and forth can become quite frequent.

Lyapunov space often contains darkish bands that run along the spikes. These represent superstable regions in which the underlying forced logistic systems exhibit the most regular behavior.

The Lyapunov space generated by alternating *a* and *b* values
contains a tiny fleck that resembles a swallow. The fleck is
enlarged in this illustration. There, just off the
swallow's tail lies another little fleck. Readers are free to
guess just what it might turn out to be when it is similarly
enlarged.

The appearance of self-similarity in the figure should not surprise students of chaos. Structures that exhibit self-similarity are more often than not produced by chaotic processes.

The methods used to create the mother swallow and its offspring
can be varied slightly to generate a host of different
creatures. The images on these pages differ only in the *a* and *b*
value sequences that were used. For example, a jellyfish - the
yellow tentacled blob shown here - is spawned from
a sequence that begins *b, b, a, b, a* and that repeats over and
over again.

This scene resembles the cover of a science fiction magazine
from the 1950s. I call it Zircon Zity because it is obviously the
futuristic metropolis of the Zirconites (whoever they are). The
underlying sequence of the zity is *bbbbbbaaaaaa*. By repeating this
sequence while calculating the Lyapunov exponent, a computer can
build the zity with all its delicate bridges, golden spaceships
and interplanetary walkways.

What does all this have to do with enzymes, carbohydrates and nutrition? At best, a small region in some Lyapunov map might actually describe the dynamics of enzymes breaking down carbohydrates. But perhaps more to the point, Markus's work makes it possible to visualize the dynamics of periodic forcing. One might say he has made chaos easier to digest.

FURTHER READING

Chaos in Maps with Continuous and Discontinuous Maxima. Mario
Markus in *Computers in Physics*, pages 481-493; September/October 1990.

The Magic Machine: A Handbook of Computer Sorcery. A. K. Dewdney.
W. H. Freeman and Company, 1990.