
From the March-April 2004 issue (Volume 92, Number 2)

Phase Change: The Computer Revolution in Science and Mathematics. Douglas S. Robertson. xiv + 190 pp. Oxford University Press, 2003. $29.95.


Since 1962, when Thomas Kuhn wrote The Structure of Scientific Revolutions, the evolution of science has been conceived of as a series of paradigm shifts—intellectual shake-ups in which one worldview is replaced by another. The story of a paradigm shift is usually told with a focus on its chief characters, the outstanding scientists who instigate a scientific revolution.

We all know about Nicolaus Copernicus and Galileo Galilei and how they revolutionized our view of the universe, but most people have never heard of Aristarchus of Samos, the Greek philosopher who proposed the heliocentric model more than 1,700 years before Copernicus. Aristarchus's ideas were rejected because they conflicted with basic sense experience. Even Copernicus's great work De Revolutionibus didn't shatter astronomy when it was published in 1543; it didn't even provoke the Holy Inquisition. No paradigm shift occurred until 1610, when Galileo used his new telescope to discover the phases of Venus, four moons orbiting Jupiter and, soon afterward, the rotation of the Sun—things that fit well into the Copernican universe but not an Earth-centered one. Galileo's observations, combined with his talent for publicizing them, finally triggered the Copernican revolution. It was the telescope that distinguished him from Aristarchus and Copernicus, by allowing him to see new things—things that nobody else could even anticipate.

In Phase Change: The Computer Revolution in Science and Mathematics, Douglas Robertson maintains that this is a general pattern: Paradigm shifts have often been preceded by "a technological or conceptual invention that gave us a novel ability to see things that could not be seen before." The effects of such an invention, he believes, fit the definition of a phase change. He borrows the term from physics—where it describes the dramatic transformation of matter as a parameter is slightly changed (a familiar example is liquid water turning to ice as the temperature falls below the freezing point)—and uses it to refer to "radical improvements in our ability to see things." The book presents many examples of such improvements, including the microscope, the seismograph, Alessandro Volta's battery and the geophysical instrumentation that confirmed Alfred Wegener's theory of continental drift.


Robertson's main objective is to get the reader to appreciate the scientific revolution that is currently being triggered by a modern instrument: the computer. According to him, the digital revolution is on a par with all previous upheavals in science, and the information explosion it has generated is bringing about phase changes in almost every field.

From a practical point of view, doing science means operating instruments, recording and analyzing data, and making calculations. The computer boosts all these activities, so it's no surprise that digitization affects all areas of science. What is surprising is the magnitude of the digital upheaval in each field. The angular resolution of the naked eye is about one arcminute; the telescope improved this to about one arcsecond, a sixtyfold gain, and the limiting resolution possible given the variable refractivity of the Earth's atmosphere. But with the use of a computer, the varying refractivity can be measured and corrections made in real time. Viewed through such an "adaptive" telescope, the stars no longer twinkle, and the angular resolution reaches a hundredth of an arcsecond, a further hundredfold gain. So, measured in terms of angular resolution, adaptive optics is a larger step forward than the original invention of the telescope. And of course there are many more ways in which astronomy and other fields benefit from the computer.
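To make that comparison concrete, here is a minimal back-of-the-envelope sketch in Python (ours, not the book's); the round figures are simply the ones quoted above, and treating them as exact is an assumption:

```python
# Rough comparison of the angular-resolution gains described above.
# The round figures (1 arcminute, 1 arcsecond, 0.01 arcsecond) are the
# ones quoted in the review; treating them as exact is an assumption.

ARCSEC_PER_ARCMIN = 60.0

naked_eye = 1.0 * ARCSEC_PER_ARCMIN  # ~1 arcminute, expressed in arcseconds
telescope = 1.0                      # seeing-limited telescope, in arcseconds
adaptive = 0.01                      # adaptive-optics telescope, in arcseconds

print(f"telescope vs. naked eye:       {naked_eye / telescope:.0f}x")  # 60x
print(f"adaptive optics vs. telescope: {telescope / adaptive:.0f}x")   # 100x
```

By this crude measure, the second jump is indeed the larger one, which is exactly the point the review is making.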

Reconstructing the internal structure of the Earth from the seismic signals we receive at the surface is a task that requires an enormous number of calculations. Scientists from the precomputer age could get only a coarse-grained picture of our planet's interior, but as computers have grown faster, it has become feasible to build models that capture that interior in great detail. That's the general pattern: Once, theoretical models had to be simple enough to be analyzed using pencil and paper, but in the computer age, model building is usually limited only by ingenuity. Computers make it possible to model the global climate. And because computers can be used to focus the beam in a particle accelerator and to analyze the vast amount of data produced by each collision, experiments can be carried out to verify fundamental theories in particle physics.

"It can be very difficult to comprehend a revolution when you are standing right in the middle of it," Robertson says in his introduction. In fact, we use computers every day, but we rarely stop to marvel at the power they give us. The computer is to the naked mind what the telescope is to the naked eye, and it may well be that future generations will consider all precomputer science to be as primitive as pretelescopic astronomy.—Stephan Mertens, Theoretical Physics, Otto-von-Guericke University, Magdeburg, Germany
