Science in 2006, Revisited

From grid computing to genomics, the science fiction of 1986 is fast becoming science fact. There remains equal reward in the signal and in the noise.



From the May-June 2003 issue: Volume 91, Number 3, Page 250

DOI: 10.1511/2003.44.250

Seventeen years ago Sigma Xi celebrated its centenary and reflected on a century of science and engineering research. I had the honor and good fortune to serve as the Society's president and to participate in a discussion of how American Scientist might celebrate this auspicious occasion.

Someone suggested that we prepare papers predicting the future course of science. "Nonsense," said a very level-headed board member. "Nobody can predict the course of science. Everyone will criticize any prediction we make. Besides, that's what makes science so exciting."


"Scientists can't predict the future of science," I said, "but science fiction writers do it every day, and often with surprising perspicacity." So I volunteered to write a piece of science fiction with the dateline of 2006, 20 years into the future, imagining how two decades of change and growth in science might look from that vantage point.

Cloudy Crystal Balls

It's not quite 2006 yet, so I have three years for some of my predictions to come true. But how well did I do? The short answer is that almost all my specific predictions have already turned out to be simply wrong. I predicted the successful construction of the Superconducting Supercollider and predicted its successor would be under design by 2006—a tunnel around Antarctica. In honor of the International Cosmological Year, the new machine would have the acronym ICY. I predicted that T. Boone Pickens would endow the Santa Fe Institute with enough money to become the department-less graduate school for interdisciplinary science (the Pickens Institute for Science, or PIS) that Murray Gell-Mann had dreamed of in the 1970s.


One of my more regrettably bad predictions was the idea that the importance of complexity, irreversibility and nonlinearity in science would bring about a resurgence of mathematics. In the United States, that has not happened. Although the mathematicians and the theoretical physicists are working together again, the U.S. government has certainly continued to starve mathematics and has done little to encourage a new generation of American mathematicians. In this critical field we not only import our students; we must import faculty as well.

Another disappointment is in education. One could foresee in 1986 a massive shift in focus from teaching to learning, especially as the cognitive sciences made such good progress. But alas, between parents who don't care, schools that can't function, and politicians who sell clichés but are unwilling to address the basic issues, pre-college education in the U.S. still struggles in a swamp of neglect and ideological determinism.

Clear Crystal Balls

In a few cases I tried to be so extravagant that my forecast would be seen as tongue-in-cheek—and those forecasts have therefore turned out to be about right for 2003. I foresaw teraflop computing (trillions of calculations per second) available on the desktop. New distributed operating systems that allow large numbers of computers to share data, and to share the computing power of machines that are momentarily idle (grid computing), combined with Moore's Law (roughly, that computing power doubles every two years) to make this possible. I correctly forecast the mapping of the human genome in the 1990s, and predicted that you would be able to buy it on a CD-ROM for $9.99.

Most important, I predicted the growth of the Internet and its impact on science. (In 1986 Steve Wolff came to the National Science Foundation and launched NSFnet using TCP/IP, the key protocols that later permitted the explosive growth of the Internet. Looking back, it is hard to believe that it was not until 1987 that 10,000 Internet hosts existed; now there are hundreds of millions around the world. See the timeline at http://www.pbs.org/internet/timeline/.) I called it WUNET (an acronym for World University Network and a pun on "one net"), seriously underpredicting the commercialization of the Internet. On one point I was actually too pessimistic (although literally accurate) in predicting that by 2006 automatic language translation would "remain incompletely solved." Finally, I correctly predicted the confusion that would engulf tenure and promotion committees in their attempts to define publication so they could decide who should perish. Even in 1986 it was clear that authors would become publishers, and that scientists would not wait to learn of the latest research advances until the print materials arrived in the snail mail.
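A back-of-the-envelope calculation (my own illustration, with assumed figures rather than numbers from the original forecast) shows why the teraflop prediction was less extravagant than it sounded: steady doubling over two decades multiplies raw speed by roughly a factor of a thousand, and grid-style pooling of idle machines can supply another such factor.

```python
# Illustrative compounding of Moore's-Law-style growth; the 1986 baseline and
# the two-year doubling period are assumptions chosen for this sketch.

def projected_flops(base_flops: float, years: float, doubling_period: float = 2.0) -> float:
    """Project raw computing speed after `years` of steady doubling."""
    return base_flops * 2 ** (years / doubling_period)

base_1986 = 1e6                                    # ~1 megaflop desktop, assumed
single_2006 = projected_flops(base_1986, years=20)
print(f"one 2006 machine:   {single_2006:.2e} flops")   # ~1e9, a gigaflop

# Grid computing aggregates many momentarily idle machines; pooling a
# thousand of them reaches teraflop scale.
grid_2006 = single_2006 * 1_000
print(f"1,000-machine grid: {grid_2006:.2e} flops")      # ~1e12, a teraflop
```

Under these assumed numbers a single desktop reaches only gigaflop scale by 2006; the teraflop forecast needed the grid to come true.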


But these were details—some of them intended to be funny when looked back upon. The serious part of the prediction about science itself was in one way correct and in another unrealized. All the trends for science were evident in 1986, and I believe most scientists would endorse the observations I made. But most would also say that the majority of science and its institutions have not moved beyond their traditional views of themselves. Change happens blindingly fast in science, but agonizingly slowly in the institutions of science. Let me summarize the most important trends I forecast.

First, science would become ever more capital-intensive, which itself would drive science down a multidisciplinary, multi-author, shared-resource path. That was a no-brainer; it surely is reality. I did propose a solution to the competition among nations for the location of the very large, shared science facilities. Every participating nation would be authorized to build a magnificent marble structure, with "World's Largest Accelerator" or "World's Most Farsighted Telescope" engraved over the front door. Inside, there would of course be no accelerator or telescope, only a mammoth bank of computers and satellite dishes through which each nation's scientists operated a machine in a generally unknown place, deep underground or atop an inaccessible mountain. Site selection would become much easier. This trend is also well under way.

Second, I saw the reintegration of the sciences—a hugely important trend that is surely under way but will be far from dominant in the structure of scientific activities in 2006.

This trend can be seen in at least four areas:

—Cognitive science, brain studies, neurophysiology and behavioral science. In these areas we do see a huge effort to bring together several threads of knowledge, much of it based on faith that one day we will be able to ground human (and animal) behavior in biochemical and physiological understanding.


—Cosmology, high-energy physics, astrophysics and mathematics. Here too the prediction is in full flower. Indeed, except for mathematics these disciplines now find themselves in the same department in many universities. Progress has been nothing less than incredible. And the prediction that the estrangement of mathematics from theoretical physics would end has surely proved right.

—Biochemistry, medical sciences, and molecular, cellular and developmental biology. It was not hard, in 1986, to predict this reintegration, given my successful prediction on the progress of genomics. I foresaw the ability to use computer modeling to design and create new molecules with chosen functions. But I was a bit optimistic in seeing the ability of genomics to tell us about the biological locus of instinctive behavior.

—Geophysics, meteorology, oceanography and paleontology. The first three have merged into planetary and earth sciences on the one hand and climate sciences on the other. Indeed, the concern about global climate change and sustainability has accelerated the study of the interrelations of oceans and atmospheres, and paleontology has proved a vital source of information (if ice cores are considered paleontology). I could add geography to the list, given the importance of studies of human habitation, energy use and technology development in the issue of sustainability.

Third, specialization and reintegration still compete. It was easy to foresee the dark side of multidisciplinary studies—the claim scholars might make to mastery of a broad interdisciplinary area without mastering any of its constituent disciplines. This would make peer review and tenure evaluations very difficult and controversial. One desperate hope was of course doomed to failure—my dream that the National Science Foundation and National Institutes of Health would stop trying to predict the work that deserved funding, and instead reward those who proved their work was worthwhile. For mature scientists that should be easy. For young scientists I proposed that, with every grant to a mature scientist (based on her record), the university would receive an additional 25 percent to be used to fund young scientists within three years of a Ph.D. The universities would choose the awardees. Back in the 1980s, when I chaired the National Science Board, I had already pushed for an additional mechanism: grants made to young investigators by program officers without peer review. The work of the awardees would be reviewed three years later and the rating put in the program officer's performance file.


I proposed that an idea Herb Simon espoused back in 1985 would be widely adopted in universities. The "Simon Standard" was quite simple: "No one was allowed to publish pan-disciplinary pronouncements until they had published at least one solid paper in each of the disciplines drawn upon" (my words, not his). I used Picasso as my model. When he was a teenager he showed he could paint like Leonardo; he earned the right to represent a bull with five lines on a piece of paper (which would later sell for millions). Under this standard, the mono-disciplinary departments would survive as keepers of the tools and standards in specialized areas of science.

Fourth, experiment and theory would become increasingly indistinguishable. On the data side it seems obvious that when the quantities of computer-acquired data explode, one builds algorithms into the computer analysis so that the experimenter sees not the output from individual sensors but rather a processed flow of data, which we should call metadata. The algorithms used to process and simplify the data are themselves based on some theory in which one has confidence. But the result is then not really experiment independent of theory. For example, if you use the symmetry properties of viruses in the analysis of x-ray crystallographic images, the result is clearly not independent of those theoretical assumptions. The traditional separation of students into "theorists" and "experimentalists" is no longer tenable in 2003, much less will it be in 2006. A similar line of argument applies to the use of exhaustive computation as a means of "proving" mathematical theorems.
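As a concrete sketch of how theory gets baked into the data stream (the detector, the readings and the fourfold symmetry here are all invented for illustration), imagine an instrument whose raw channel readings never reach the experimenter: the analysis code averages together channels assumed equivalent under a postulated symmetry, so the "data" the experimenter sees already embodies that theoretical assumption.

```python
import statistics

# Hypothetical detector: raw readings from eight channels arranged around a
# sample that theory says has fourfold rotational symmetry.
raw_channels = [10.2, 10.9, 9.8, 10.6, 10.4, 10.7, 9.9, 10.5]

def fold_by_symmetry(readings, fold=4):
    """Average channels assumed equivalent under the stated symmetry.

    The symmetry 'fold' is a theoretical assumption built into the analysis;
    the experimenter never sees the raw per-channel values, only this output.
    """
    period = len(readings) // fold   # channels i and i+period map onto each other
    return [statistics.mean(readings[i::period]) for i in range(period)]

processed = fold_by_symmetry(raw_channels)
print(processed)  # the "processed flow of data" the experimenter works with
```

The experimenter works with the two folded values, not the eight raw ones; change the assumed symmetry and the "experimental result" changes with it.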

Science and Society

My forecast went only as far as thinking about the potential integration of the social and physical sciences. The main prediction was the growing recognition that research tools for dealing more effectively with the major problems facing societies must be improved. Again putting on the glasses of a scientist in 2006 to acquire her hindsight, I wrote, "Creative intuition is a valuable—even essential—tool for both scientific and artistic progress. In the social sciences, however, intuition had long proved a dangerous trap. It was easier to be objective when man studied nature. Man's study of man is the ultimate challenge. But the challenge had to be faced."

What I did not address in our 2006 retrospective were some truly important issues that are transforming science in many dimensions. Perhaps the most serious oversight was my failure to foresee the highly welcome growth in the participation of women in science, not only as students but also as business leaders and senior faculty. In those senior posts women are still seriously underrepresented, but the trends are strong and favorable.

My optimism about the growing political support for better environmental stewardship in the U.S. now seems extravagant in view of the current administration's reversal of much of the progress of preceding years. Nor did I anticipate the extent to which science would lose the insulation from politics it once enjoyed. Science, many would say, has become too important to be left to the scientists. We have many indicators of a new and more complex relationship between science and society: the rise of scientific fraud and of new quasi-judicial processes to find and punish it; the insistence by Congress that agencies supporting science document not only the resulting scientific outputs but also the outcomes in the form of benefits to society; and the rise in congressional earmarks, which divert billions of dollars from the safeguard of merit review. None of this should come as a surprise, given the enormous growth of biomedical research budgets, but it calls for a new maturity and a new sense of accountability on the part of scientists.

Finally, I can hardly be faulted for failing to foresee the rise of catastrophic terrorism, bringing with it a felt need to constrain the flow of basic scientific knowledge to terrorists while still enjoying the fruits of science for medicine, environment and the economy.


But the bottom line to this effort at seeing the future of science is that the attraction of science as a life's vocation is unchanged. "In 2006," I wrote, "God still loves the noise as much as the signal. Man is still aware that with every step forward in science, two delicious new questions—crying out for study—were born."
