Fractals, Chaos, and Self-Organized Criticality in Geography

Donald L. Turcotte
Department of Geological Sciences, Cornell University, Ithaca, NY 14853, USA

Abstract
The concept of fractals was introduced by Benoit Mandelbrot in terms of the length of a rocky coastline: the measured length has a fractional power-law dependence on the length of the measuring rod used. An equivalent fractal method is box counting: the number of boxes of various sizes required to cover the coastline is determined, and a fractional power-law dependence of the number of boxes on the box size is found. The simplest application of fractals is to plot the cumulative number of objects larger than a specified size against the size. In many cases (fragments, earthquakes, faults, mineral deposits, oil fields, etc.) a fractional power-law (fractal) dependence is found. The rationale for the wide applicability of fractals is that the power law is the only statistical distribution that is scale invariant.
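The box-counting method described above can be sketched numerically. The following is a minimal illustration, not an implementation from the text: the function name and box sizes are choices of this sketch, and the fractal dimension D is estimated as the negative slope of log N(r) against log r, since N(r) ~ r^(-D).

```python
import numpy as np

def box_count_dimension(points, box_sizes):
    """Estimate the box-counting (fractal) dimension of a 2-D point set.

    For each box size r, count the number of boxes N(r) needed to cover
    the set; a fractal satisfies N(r) ~ r^(-D), so D is the negative
    slope of log N(r) versus log r.
    """
    counts = []
    for r in box_sizes:
        # Assign each point to a box of side r and count distinct boxes.
        boxes = set(map(tuple, np.floor(points / r).astype(int)))
        counts.append(len(boxes))
    # Least-squares fit of log N(r) against log r.
    slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
    return -slope

# Sanity check on a smooth curve: a straight line should give D near 1.
t = np.linspace(0.0, 1.0, 20000)
line = np.column_stack([t, t])
d = box_count_dimension(line, box_sizes=np.array([0.1, 0.05, 0.025, 0.0125]))
```

Applied to a digitised coastline instead of the straight line, the same fit would return a fractional dimension between 1 and 2.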

Drainage networks have long been recognised as being fractal trees through the validity of Horton's laws. An improved taxonomy of fractal trees has been developed utilising a matrix classification of side-branching statistics. Computational software is available to obtain the matrix coefficients directly from the digitised topography of a drainage basin. This taxonomy has also been applied to computer models of cluster formation such as diffusion-limited aggregation (DLA).
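Branching taxonomies of this kind build on Horton-Strahler stream ordering. As a minimal sketch (the tree and function name here are hypothetical, not from the software mentioned above), the Strahler order of a network can be computed recursively: source streams have order 1, and a junction's order increases by one only when its two highest-order tributaries have equal order.

```python
def strahler_order(children, node):
    """Compute the Horton-Strahler order of a stream-network node.

    Leaves (source streams) have order 1. A junction takes the maximum
    order of its tributaries, incremented by one when the two largest
    tributary orders are equal.
    """
    kids = children.get(node, [])
    if not kids:
        return 1
    orders = sorted((strahler_order(children, k) for k in kids), reverse=True)
    if len(orders) > 1 and orders[0] == orders[1]:
        return orders[0] + 1
    return orders[0]

# A small hypothetical network: node 0 is the outlet, nodes 3-6 are
# order-1 sources merging pairwise, so the outlet has order 3.
children = {0: [1, 2], 1: [3, 4], 2: [5, 6]}
order = strahler_order(children, 0)
```

Counting the streams of each order in a real basin then gives the bifurcation ratios that enter Horton's laws.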

A second class of fractals with wide applicability are self-affine fractals. Applications are generally to time series such as topography and river flows. Fractal time series have a power-law dependence of the power spectral density S on frequency f, S ~ f^(-β); β = 0 is white noise, β = 1 is red noise, and β = 2 is brown noise (a Brownian walk). Topography generally has a β near 2. Fractional Gaussian noises have -1 < β < 1; they are generally treated using Hurst's rescaled-range analysis. For a white noise (β = 0) the Hurst exponent is Hu = 0.5; for -1 < β < 0 one obtains 0 < Hu < 0.5 and the time series is antipersistent; for 0 < β < 1 one obtains 0.5 < Hu < 1.0 and the time series is persistent. For many natural phenomena, including climate, rainfall, river flows, tree rings, varves, and sunspots, it is found that Hu ≈ 0.75 and β ≈ 0.5.

Chaos is generally associated with low-order systems, either recursive relations such as the logistic map or sets of nonlinear equations such as the Lorenz equations. While solutions illustrate chaotic behaviour and exhibit a variety of fractal statistics, they have limited practical applicability. Slider-block models provide a bridge between low-order systems that exhibit classic deterministic chaos and high-order systems that exhibit self-organised criticality. A pair of interacting slider blocks is a classic example of a low-order chaotic system. A large number of interacting slider blocks behave like the sand-pile model that Per Bak used to introduce the concept of self-organised criticality.
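Rescaled-range analysis can be sketched as follows. This is an illustrative implementation under simplifying assumptions (non-overlapping windows, a plain least-squares fit; the function name and window sizes are choices of this sketch): for each window length n, R is the range of the cumulative mean-adjusted sum and S the standard deviation, and (R/S) ~ n^Hu gives the Hurst exponent as a log-log slope.

```python
import numpy as np

def hurst_rs(series, window_sizes):
    """Estimate the Hurst exponent Hu by rescaled-range (R/S) analysis.

    The series is split into non-overlapping windows of length n; in
    each window, R is the range of the cumulative mean-adjusted sum and
    S the standard deviation. Since (R/S) ~ n^Hu, Hu is the slope of
    log(R/S) against log n.
    """
    rs_means = []
    for n in window_sizes:
        rs = []
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            dev = np.cumsum(w - w.mean())   # cumulative departure from mean
            r = dev.max() - dev.min()       # range R
            s = w.std()                     # standard deviation S
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope

# A white noise (beta = 0) should give Hu near 0.5.
rng = np.random.default_rng(0)
hu = hurst_rs(rng.standard_normal(10000), window_sizes=[16, 32, 64, 128, 256])
```

A persistent record such as a long river-flow series would instead yield Hu above 0.5, consistent with the Hu ≈ 0.75 cited above.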

The forest-fire model is a simple cellular-automata model that exhibits self-organised critical behaviour. In the simplest formulation of this model, a two-dimensional square grid of points is considered. At each time step, either a tree is planted on a random grid point or a match is dropped on a random grid point. If the sparking frequency is f = 1000, then 999 trees are planted before each match is dropped. If a match is dropped on a grid point occupied by a tree, that tree burns along with all adjacent trees. The output is the statistical distribution of forest-fire sizes. If the sparking frequency is relatively small, a fractal distribution is found with a maximum cutoff. If the sparking frequency is very large, the entire forest burns every time a match is dropped. Actual forest-fire statistics appear to be in agreement with this simple model, and a variety of other applications have been made, including epidemics.
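The model just described can be sketched in a few lines. This is a minimal version under stated assumptions: the grid size, number of steps, and sparking frequency are arbitrary choices of this sketch, a match is dropped deterministically every f-th step, and burning spreads to nearest neighbours by flood fill.

```python
import numpy as np

def forest_fire(size=64, steps=200000, f=500, seed=0):
    """Minimal forest-fire cellular automaton on a square grid.

    Every f-th step a match is dropped on a random cell; on all other
    steps a tree is planted on a random cell. A match landing on a tree
    burns the whole connected cluster of adjacent trees. Returns the
    list of fire sizes, whose statistics are the model output.
    """
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=bool)
    fires = []
    for step in range(steps):
        i, j = rng.integers(size, size=2)
        if step % f == f - 1:               # drop a match
            if grid[i, j]:
                stack, burned = [(i, j)], 0
                while stack:                # flood-fill the burning cluster
                    x, y = stack.pop()
                    if 0 <= x < size and 0 <= y < size and grid[x, y]:
                        grid[x, y] = False
                        burned += 1
                        stack += [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
                fires.append(burned)
        else:                               # plant a tree
            grid[i, j] = True
    return fires

fires = forest_fire()
```

Plotting the cumulative number of fires larger than a given size against size on log-log axes would reveal the fractal (power-law) distribution with its maximum cutoff.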

The concepts discussed above lead naturally to the renormalization group approach to the numerical treatment of complex problems in geography. A relatively small number of first-order elements are considered at the smallest scale; these elements constitute a first-order cell, and results are averaged to give the behaviour of the cell. The first-order cells become second-order elements, and the process is repeated. This is the renormalization group approach, which has wide applicability to complex natural problems.
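The element-to-cell averaging can be sketched with a simple coarse-graining step. This is an illustrative toy, not a specific geographic application: the averaging rule (a plain 2x2 block mean) and the random field are hypothetical choices standing in for whatever element behaviour a real problem would average.

```python
import numpy as np

def coarse_grain(grid):
    """One renormalization step: average 2x2 blocks of elements.

    Each 2x2 block of first-order elements becomes a single
    second-order element carrying the block average.
    """
    n = grid.shape[0] // 2
    return grid.reshape(n, 2, n, 2).mean(axis=(1, 3))

# Iterate the coarse-graining on a random 16x16 field of element
# values until a single cell remains, retaining each level.
rng = np.random.default_rng(1)
field = rng.random((16, 16))
levels = [field]
while levels[-1].shape[0] > 1:
    levels.append(coarse_grain(levels[-1]))
# The final 1x1 cell carries the mean of the original field.
```

In a real renormalization group calculation the rule applied at each level would encode the physics of the elements (e.g. a percolation or failure criterion) rather than a simple mean, but the hierarchical structure is the same.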