COMPUTING SCIENCE

Qwerks of History

Brian Hayes

Let me take you back to 1969, the year of Woodstock and of Neil Armstrong's giant leap for mankind. Nixon was in the White House that year, Elvis was playing Vegas, and the Beatles were recording Abbey Road. Bill Gates and Steve Jobs were high school kids. Such a long time ago it was—half a lifetime. In the world of computer technology, 1969 is halfway back to the dawn of time.

And yet, when you look closely at the latest computer hardware and software, it's not hard to find vestiges of 1969. Your shiny new Pentium processor can trace its heritage back to a chip whose design was begun that very year. The Unix operating system, whose offshoots are blooming luxuriantly these days, also has its roots in that flower-power summer. And the first four nodes of the ARPANET, progenitor of today's Internet, began exchanging packets in the last months of 1969. The window-and-menu interface we know so well was still a few years in the future; but, on the other hand, the computer mouse was already a few years old.

One reaction to the longevity of so much early computer technology is admiration for the pioneers of the field. They must have been clear-thinking and far-seeing innovators to get so much right on the first try. And it's true: Giants strode the earth in those days. Hats off to all of them!

But in celebrating the accomplishments of that golden age, I can't quite escape a nagging question: What has everybody been doing for the past 35 years? Can it be true that technologies conceived in the era of time-sharing, teletypes and nine-track tape are the very best that computer science has to offer in the 21st century?



