

A Box of Universe

Watch the cosmos evolve in a cube one billion light-years wide

Brian Hayes

Observation, Theory, Simulation

All of these head-spinning changes in our understanding of the cosmos are anchored in scientific knowledge. They may seem like one wild idea after another, but they are supported by data from observations, by physical theory and by simulations that connect theory with observation.

In the past 20 years astronomy has become a data-rich, statistical science. For example, the Sloan Digital Sky Survey has catalogued 500 million objects and recorded almost two million spectra. The spectra allow measurements of redshifts and thus of distance along the line of sight. The data yield a three-dimensional map that covers about a third of the sky and goes back in time more than a billion years.
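The redshift-to-distance step can be sketched in a few lines. For small redshifts, Hubble's law gives distance as roughly cz/H₀; the value of the Hubble constant below is a common approximation, chosen here only for illustration.

```python
# Illustrative only: Hubble's law d ≈ c*z / H0 holds for small redshifts.
# The Hubble constant is an assumed round value, not a measured result.

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per megaparsec (assumption)

def hubble_distance_mpc(z):
    """Approximate line-of-sight distance in megaparsecs for small z."""
    return C_KM_S * z / H0

# A galaxy at redshift 0.1 lies roughly 430 Mpc away,
# on the order of 1.4 billion light-years.
print(round(hubble_distance_mpc(0.1)))
```

At larger redshifts this linear relation breaks down and a full cosmological model is needed, which is one reason surveys like Sloan are interpreted within ΛCDM.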

Another ongoing endeavor aims to extract information from the oldest photons in the universe. A satellite called the Wilkinson Microwave Anisotropy Probe (WMAP) has measured tiny spatial variations in the microwave background radiation, giving us a glimpse of the distribution of matter and energy at an early epoch—the initial conditions for the visible universe.

All these data are interpreted in the context of Einstein’s general theory of relativity, which describes the interactions of matter, energy, space and time in an expanding universe. Quantum mechanics also has a role, for example in explaining the spectrum of fluctuations in the microwave background. But these theoretical principles are not enough to predict the shapes and sizes and other properties of the structures that emerge as the universe evolves. Computer simulation is the best tool for this purpose. Starting from plausible initial conditions and known physical laws, we can compare the output of the simulation with astronomical data. Along the way, we get to watch a movie of the universe unfolding.
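The core of such a simulation can be sketched as a toy direct-summation N-body step. This is not how Millennium or Bolshoi actually compute forces (they use tree and particle-mesh methods to avoid the O(N²) cost); it only shows the underlying idea that gravity, acting on a slightly uneven initial distribution, amplifies structure over time. All units and parameter values here are arbitrary code units chosen for illustration.

```python
# A toy N-body sketch in arbitrary code units: every particle attracts
# every other, and a kick-drift (semi-implicit Euler) update advances
# the system. Softening is a standard trick to avoid infinite forces
# when two particles pass very close.
import math
import random

G = 1.0           # gravitational constant in code units (assumption)
SOFTENING = 0.05  # force-softening length (assumption)

def accelerations(pos, mass):
    """Pairwise gravitational acceleration on each particle, O(N^2)."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + SOFTENING**2
            f = G * mass[j] / (r2 * math.sqrt(r2))  # G*m/r^3
            for k in range(3):
                acc[i][k] += f * dx[k]
    return acc

def step(pos, vel, mass, dt):
    """One kick-drift update: velocities first, then positions."""
    acc = accelerations(pos, mass)
    for i in range(len(pos)):
        for k in range(3):
            vel[i][k] += acc[i][k] * dt
            pos[i][k] += vel[i][k] * dt

# Plausible initial conditions: particles scattered in a unit box, at rest.
random.seed(1)
N = 32
pos = [[random.uniform(0.0, 1.0) for _ in range(3)] for _ in range(N)]
vel = [[0.0, 0.0, 0.0] for _ in range(N)]
mass = [1.0 / N] * N

for _ in range(100):
    step(pos, vel, mass, 0.01)
```

Scaling this brute-force loop to the 10 billion particles of the Millennium Run is hopeless, which is why production codes replace the all-pairs sum with hierarchical approximations.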

As early as 1941 the Swedish astronomer Erik Holmberg simulated the clustering of galaxies with a remarkable analog computer he made out of light bulbs and photocells. By the early 1960s digital computers were the instrument of choice. Sverre J. Aarseth of the University of Cambridge studied galaxy clusters having 25 to 100 members.

Modern computing machinery allows much larger models. In 2005 Volker Springel of the Max Planck Institute for Astrophysics in Garching, Germany, and 16 colleagues published the results of a landmark simulation called the Millennium Run. It was based on the ΛCDM model; it included more than 10 billion particles of dark matter in a cubical volume more than 2 billion light-years on a side; it covered a span of time from 12 million years after the Big Bang up to the present. Unfortunately, some of the parameters that defined the initial conditions were based on early results from the WMAP satellite mission, which were later significantly revised.

The Bolshoi simulation (described in more detail below) is slightly smaller than the Millennium Run but attains higher resolution in measuring positions, masses and forces. It is also based on the updated WMAP parameters.


