
FEATURE ARTICLE

Science in 2006

A former IBM chief scientist looks ahead from 1986 into the twenty-first century

Lewis Branscomb

Quantification in the Social Sciences

There was another very important reason why the social sciences began to achieve equal footing with natural science in terms of government support and academic prestige: they had become substantially more quantitative and rooted in robust collections of data. During the difficult years of the early 1980s, when social science budgets were hard hit, the National Science Foundation and private foundations worked hard to sustain the major data collections, especially the longitudinal collections that, once interrupted, could not be rebuilt. With improving analytical methodology and substantial computer support, behavioral scientists began to provide an interesting degree of predictive power.

Since the turn of the century, even the macroeconomists had been publishing a significant number of papers in which their mathematical models and interpretations were tested against credible collections of empirical data. Too bad Wassily Leontief did not live to see it happen.

So, by 2006, the National Science Foundation had already had a Directorate of Economic and Social Sciences for seventeen years (it was formed in the first year of the administration elected in 1988), and by this same year fully 23% of the members of Sigma Xi were from human and social science disciplines. It had all happened quite naturally.

Post-Handbook Engineering

At the same time that science and even art were becoming more technological, engineering was also changing. It had already, by 1986, largely outgrown its handbook past. It was simply not possible to keep the printed handbooks up to date with the extraordinary progress in the science base for engineering. In materials science, for example, instead of building things with materials in stock, whose known properties were in the handbooks, engineers were increasingly creating new materials especially designed for a specific purpose. Many of these materials were complex composites unknown in nature. The equivalent of the data tables in the handbooks became algorithms and models of design and fabrication processes. For this, the materials engineer and his materials science colleague had to stay abreast of both an extensive scientific literature and a burgeoning body of practical experience with the preparation, performance analysis, and processing of useful new materials.

As a result, doctoral degree programs in materials engineering and science expanded dramatically, combining the skills of the engineer (problem solving and sensitivity to production, quality, and cost factors) with the knowledge of the materials scientist (the uses of ceramic and polymeric substances, interfaces, and synthetic composites). Computer simulation models of hypothetical materials made it possible to predict the performance of a variety of new materials tailored to specific applications.
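As a minimal sketch, not drawn from the article, the kind of model-based "handbook entry" that replaced a printed table of properties might look something like the following, which estimates a fiber composite's stiffness from its constituents using the standard rule-of-mixtures bounds. The materials and volume fractions here are hypothetical.

    # Illustrative sketch (not from the article): predicting a composite's
    # stiffness from its constituents instead of looking it up in a handbook.
    # The rule-of-mixtures formulas are standard first-order estimates; the
    # specific materials and numbers are hypothetical.

    def longitudinal_modulus(fiber_modulus, matrix_modulus, fiber_fraction):
        """Voigt (rule-of-mixtures) estimate along the fiber direction, in GPa."""
        return fiber_fraction * fiber_modulus + (1.0 - fiber_fraction) * matrix_modulus

    def transverse_modulus(fiber_modulus, matrix_modulus, fiber_fraction):
        """Reuss (inverse rule-of-mixtures) estimate across the fibers, in GPa."""
        return 1.0 / (fiber_fraction / fiber_modulus
                      + (1.0 - fiber_fraction) / matrix_modulus)

    if __name__ == "__main__":
        # Hypothetical carbon-fiber/epoxy laminate: 230 GPa fibers, 3.5 GPa matrix.
        for vf in (0.4, 0.5, 0.6):
            e1 = longitudinal_modulus(230.0, 3.5, vf)
            e2 = transverse_modulus(230.0, 3.5, vf)
            print(f"Vf = {vf:.1f}: E1 = {e1:6.1f} GPa, E2 = {e2:5.2f} GPa")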

Complex Systems for Design and Production

But engineers were also mastering the systems integration of automated support for development, design, production, and testing. Computer-aided design went far beyond the creation of mechanical parts and the layout of integrated circuits. The design process now became the output of a computer system driven by functional and materials specifications. This had already happened to microelectronic logic designers back in the 1970s. But now all branches of engineering showed improvements in productivity that came not so much from lower costs as from shorter development cycles. The key was the ability of the system to pass design data directly to the automated production tooling in electronic form, leading to the automatic generation of testing programs.

The process specifications were coded into the CAD/CAM system; of course, given the necessity for safety margins, these specifications had to be somewhat more conservative than laboratory measurements might justify. Thus the process rules incorporated in the automated tooling became the equivalent of the design data in the old handbooks. But now design margins could be kept tight, because the materials and processes were so much more accurately characterized.
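By way of illustration only (the article names no particular system), such conservative process rules might be encoded as a laboratory measurement derated by a knockdown factor, with each design checked against the resulting allowable. The class, factor, and numbers below are hypothetical.

    # Illustrative sketch (not a real CAD/CAM system): turning laboratory
    # measurements into conservative process rules, then checking a design
    # against them. The knockdown factor and numbers are hypothetical.

    from dataclasses import dataclass

    @dataclass
    class ProcessRule:
        material: str
        measured_strength_mpa: float   # best laboratory value
        knockdown: float               # conservative derating factor (< 1.0)

        @property
        def design_allowable_mpa(self) -> float:
            """Strength the automated tooling is allowed to assume."""
            return self.measured_strength_mpa * self.knockdown

    def margin_of_safety(applied_stress_mpa: float, rule: ProcessRule) -> float:
        """Positive margin means the design passes against the derated allowable."""
        return rule.design_allowable_mpa / applied_stress_mpa - 1.0

    if __name__ == "__main__":
        rule = ProcessRule("hypothetical alloy", measured_strength_mpa=480.0, knockdown=0.8)
        for stress in (300.0, 400.0):
            ms = margin_of_safety(stress, rule)
            verdict = "pass" if ms > 0 else "fail"
            print(f"applied {stress:5.1f} MPa -> margin {ms:+.2f} ({verdict})")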

The introduction of automation into industrial processes raised the level of skill necessary for software, test, and production systems engineering. The mathematical and computer/communications requirements were now quite demanding, and major research efforts in systems engineering were being pursued at all the major engineering schools. Students at Cornell's School of Engineering occasionally stopped by the exhibit installed for the school's centenary in 1985 to stare at the slide rule and soldering iron on display there. How could so much progress have been made with such primitive tools, they wondered?

Software and Systems Science

The evolution of the economy toward services and away from heavy industrial processing of raw materials had gone quite far by 1986. The shift in the US economic structure was well under way in the mid-1980s, and was reflected in the nearly disastrous trade wars at the end of the decade. But people soon realized that the largest part of US export income came from knowledge and services, and improving the productivity of the service sector became a high economic priority.

The universities responded in a number of innovative ways. Economics became more pragmatic, and microeconomic research enjoyed a major revitalization. The theory of human organizations was making great strides, and one of the most popular research goals in 2006 was to understand how, using modern information networks, it was possible to sustain both a high degree of spontaneous individual creativity and a high degree of common purpose and efficiency in the use of resources.

Software Engineering

A new "clinical" degree in software engineering-a four-year postgraduate degree equivalent to the M.D. plus internship-was now the most popular professional program in the universities. It offered assured employment in the management of the information resources that are the lifeblood of any large organization. The degree holders had mastered digital telecommunications and computer engineering, application analysis and modeling, complex systems management, microeconomics, and business management, and had a good understanding of the human beings who are the ultimate users of these systems.

Although the Japanese had made deep inroads in the American market for microelectronic hardware in the 1990s, the new thrust in the United States was not matched elsewhere. Our economy began to be based on the combination of American strengths in research, organization, and respect for individual creativity.

Publish or Perish

By 2006, the life of the typical academic scientist had not changed all that much, except in intellectual and technological terms. There was the usual number of academic committees trying to practice decision by consensus, although they mostly met in computer-video conferences (making it easier to work or sleep during the more tedious moments). Students still had to be taught, and research accomplishments evaluated for promotions. But "publish or perish" now seemed more than ever an outdated caricature of how academic merit should be measured.

First of all, it was far less clear what "publish" meant, since the research output lived on the network and was incorporated in shared data collections. Its methodology was recorded on magneto-optic disks, embodied in video imagery, and even enshrined in the solutions of social problems professionally and effectively addressed. All these options made research much more interesting and accessible. But combined with the reunification of many of the disciplines, they made the task of academic administration much more difficult. While it was easier to find scholars outside the university to evaluate the work of a scientist being considered for tenure, it was virtually impossible to find any way to count the items of output to his or her credit.

Tenure and promotion committees, therefore, were forced to consider the substance of the case, not its form. But the tensions of the 1980s on the subject of employment were lessened by the resurgence of population pressure on the campuses. The babies of the 1980s had arrived.

Plus Ça Change

With all the change that had overtaken science in twenty years, the really important things never changed: the excitement of an unexpected discovery, the joy of a student who has made an intellectual leap beyond the work of the professor, the beauty of a three-dimensional color image of a virus, created out of theory and x-ray diffraction patterns, shimmering on a large graphics tube like a strange and wondrous bird on first discovery. In 2006 God still loved the noise as much as the signal. Man was still aware that with every step forward in science two delicious new questions, crying out for study, were born.




