COMPUTING SCIENCE

# The World in a Spin

# The Spin Cycle

The Ising model was rescued from obscurity in the 1930s, when Rudolf Peierls perceived that a two-dimensional array of spins might admit more interesting behavior than the one-dimensional system. Again a simple qualitative argument suggests the reason. Take 10,000 spins, all *up*, and arrange them in a square array. When you flip half of the spins to abolish the magnetization, at least 100 pairs of neighbors must be antiparallel. The energy penalty associated with these opposed spins is larger than it was in the one-dimensional case—perhaps large enough to maintain magnetization in the presence of thermal disruption.

This informal argument is encouraging, and Peierls gave a stronger version, but if you want to understand the Ising model mathematically, you need a way to calculate the probability of any configuration of the spins at any temperature. To see what this entails, it's helpful to work through a miniature example—a two-dimensional Ising array with just four spins arranged in a square. Since each spin has two possible values, the system can take on any of 16 configurations, or states. In each state the magnetization is the number of *up* spins minus the number of *down* spins. Likewise the energy is the number of antiparallel neighbors minus the number of parallel neighbors. Calculating these properties calls for nothing but the simplest arithmetic. But what we want to know is the *probability* of each state at a given temperature, which is harder to determine.
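The bookkeeping described above is simple enough to spell out in a few lines of code. Here is a minimal Python sketch (the corner layout and helper names are my own, not the article's): spins take the values +1 and −1, the square's four edges define the neighboring pairs, and each state's magnetization and energy follow directly from the definitions in the text.

```python
from itertools import product

# The four spins sit at the corners of a square; its four edges
# define the neighboring pairs (diagonals are not neighbors).
# Corner layout assumed here:  0 1
#                              2 3
EDGES = [(0, 1), (1, 3), (3, 2), (2, 0)]

def magnetization(spins):
    # number of up spins minus number of down spins
    return sum(spins)

def energy(spins):
    # antiparallel pairs minus parallel pairs: each edge contributes
    # +1 if its two spins differ, -1 if they agree
    return sum(1 if spins[i] != spins[j] else -1 for i, j in EDGES)

# Enumerate all 16 states and tabulate their properties.
for spins in product([+1, -1], repeat=4):
    print(spins, " M =", magnetization(spins), " H =", energy(spins))
```

The all-*up* state, for instance, has magnetization 4 and energy −4 (four parallel pairs, no antiparallel ones), while the checkerboard state has energy +4.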

A first step toward calculating probabilities is an exponential function called the Boltzmann weight, defined as *e*^{-H/T}, where *H* is the energy and *T* is the temperature. (If you want to measure *H* and *T* in joules and kelvins, a constant is needed to make the units come out right, but in an abstract model these formalities can be ignored.) The formula for the Boltzmann weight reveals in a general way how the probability varies with temperature and energy. If the temperature is extremely high, then *e*^{-H/T} has a value close to 1 no matter what the energy, and all configurations are about equally likely. At low temperature, on the other hand, small differences in energy produce extreme changes in *e*^{-H/T}, so that only the states of lowest energy are likely to be seen.
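A quick numerical experiment makes this temperature dependence concrete. The sketch below (my own illustration, using the energies −4, 0 and +4 that arise in the four-spin example) prints the Boltzmann weights at a high, a moderate and a low temperature:

```python
import math

def boltzmann_weight(H, T):
    # relative weight of a state with energy H at temperature T
    return math.exp(-H / T)

for T in (100.0, 2.0, 0.5):
    weights = [boltzmann_weight(H, T) for H in (-4, 0, 4)]
    print(f"T = {T}: weights for H = -4, 0, +4 ->",
          [round(w, 3) for w in weights])
```

At *T*=100 the three weights are all within a few percent of 1, so every state is about equally likely; at *T*=0.5 the lowest-energy state outweighs the highest by a factor of several million.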

The Boltzmann weight is proportional to a state's probability, but to get at the probability itself we need to know something more. The weights have to be *normalized*, so that the probabilities of all possible states add up to 1. In other words, we have to divide the Boltzmann weight for a single state by the sum of the weights for all possible states. This sum is called the partition function, and it plays a crucial role in the Ising model and in other thermodynamic systems.

For the four-spin array, it's no great challenge to compute the partition function. Simply calculate *e*^{-H/T} for each of the 16 states, and add the results. For example, at a temperature of 2, the sum of the Boltzmann weights is 27.05. At the same temperature the weight of the specific state that has all four spins *up* is about 7.39; thus the probability of this state is 0.27. If you observe a four-spin Ising system at *T*=2 long enough, you should see it with all spins *up* a little more than a quarter of the time.
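Brute-force enumeration confirms these figures. A short Python sketch (again using my own assumed corner layout for the square) sums the Boltzmann weights over all 16 states and normalizes:

```python
import math
from itertools import product

# Edges of the square, for corners laid out  0 1
#                                            2 3
EDGES = [(0, 1), (1, 3), (3, 2), (2, 0)]

def energy(spins):
    # antiparallel pairs minus parallel pairs
    return sum(1 if spins[i] != spins[j] else -1 for i, j in EDGES)

T = 2.0
states = list(product([+1, -1], repeat=4))
weights = [math.exp(-energy(s) / T) for s in states]
Z = sum(weights)            # the partition function
p_all_up = weights[0] / Z   # states[0] is the all-up configuration
print(f"Z = {Z:.2f}, weight(all up) = {weights[0]:.2f}, "
      f"probability = {p_all_up:.2f}")
# -> Z = 27.05, weight(all up) = 7.39, probability = 0.27
```

The all-*up* state has energy −4, so its weight is *e*^{2} ≈ 7.39, and dividing by *Z* ≈ 27.05 gives the probability 0.27 quoted above.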

Notice that in order to find the probability of a single state, we have to compute the Boltzmann weights of *all* the states. For a system of four spins, this calculation is easy, but for 40 spins the trillion possible states would challenge the fastest computers, and for 400 spins a brute-force enumeration is unthinkable. Yet it's the larger systems that matter most. Indeed, what we really want to know is the partition function in the thermodynamic limit—as the number of spins tends to infinity.

Calculating the partition function of the two-dimensional model is hard but not impossible. The problem was solved—to much surprise and acclaim—in 1944 by Lars Onsager, a chemist at Yale University. Of course Onsager's method was not direct enumeration; he got his result through a virtuoso performance in operator algebra, which I'll not attempt to explain since my own grasp of it is tenuous at best. (There is a thorough exposition in Martin H. Krieger's book *Constitutions of Matter*, which also reprints two of Onsager's papers.) The final product was an exact expression for the partition function in the thermodynamic limit.

From the partition function flow all the macroscopic, observable properties of the model. In particular, Onsager's theory describes the onset of magnetization at the critical temperature *T*_{C}. The magnetization is equal to (*T*_{C} − *T*)^{1/8}; note that this quantity has two values at any temperature below *T*_{C} but becomes undefined (within the field of real numbers) above *T*_{C}. In other words, the magnetization vanishes at the critical temperature.

