Information, Reimagined

A new biography introduces readers to Claude Shannon: inveterate innovator, free-spirited juggler and unicyclist, and the mathematician credited with founding information theory.


January-February 2018

Volume 106, Number 1
Page 54

DOI: 10.1511/2018.106.1.54

A Mind at Play: How Claude Shannon Invented the Information Age. Jimmy Soni and Rob Goodman. 366 pp. Simon & Schuster, 2017. $27.


A Mind at Play, Jimmy Soni and Rob Goodman’s new biography of Claude Shannon, the mathematician considered the father of information theory, introduces us to its subject with an anecdote: After falling out of sight during the 1960s, Shannon made an unannounced appearance in 1985 at the International Information Theory Symposium in Brighton, England. The shy, white-haired celebrity was eventually spotted and soon afterward mobbed by autograph-seeking fans. Persuaded by the symposium chairman to come to the podium at the evening banquet, the reluctant Shannon had to endure hearing himself introduced as “one of the greatest scientific minds of our time.” When the cheering and applause finally subsided, Shannon could only say, “This is—ridiculous!” He reached into his pocket, produced three balls, and began to juggle.


Photo courtesy of the Shannon family


The chairman later described the bizarre scene: “It was as if Newton had showed up at a physics conference.” Although hyperbolic (Newton, as far as we know, could not juggle), his summary expresses an admiration for Shannon that has only grown stronger through the years. It reached a dramatic height in 2016, the centennial of Shannon’s birth. Celebratory conferences were held around the world. A Google doodle marked the day, April 30, 1916, when Claude Elwood Shannon was born in Petoskey, Michigan. One wonders what Shannon would have thought of all the fuss.

The fuss, however, is understandable: Shannon’s landmark innovations, especially in laying the theoretical groundwork for encoding messages for transmission and in determining how digital circuits could be designed, link him inextricably to today’s information age. And in the wake of the centennial, Soni, a journalist, and Goodman, a writer and political scientist, have handily supplied curious readers with more of the modest mathematician’s story.

Shannon’s most productive years, those between 1940 and the mid-1950s, were spent in Manhattan at Bell Telephone Laboratories (which later moved to Murray Hill, New Jersey). During World War II he worked on a variety of projects involving electronics and cryptography. However, Shannon’s enduring fame rests mainly on his landmark paper, “A Mathematical Theory of Communication,” published in 1948 in the Bell System Technical Journal and republished by University of Illinois Press in 1963.

In the short paper Shannon considered the problem of transmitting digital data (that is, sequences of zeroes and ones) along a noisy channel. Many had believed that in order to increase the rate at which information can be transmitted, one should simply increase the power of the signal source. Building on earlier work by Harry Nyquist and Ralph Hartley, two colleagues at Bell Labs, Shannon showed that in fact there is a maximum rate of transmission over any channel. Assuming that the channel interference is caused by white noise, Shannon gave an easily computable formula for the maximum rate in terms of bandwidth and signal-to-noise ratio. Its calculated rate is a sharp maximum, meaning that it can be approached as closely as we desire, but it can never be exceeded.
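To make the formula concrete: for a channel of bandwidth B (in hertz) and signal-to-noise power ratio S/N, the maximum rate, now known as the Shannon–Hartley capacity, is C = B log2(1 + S/N) bits per second. Here is a minimal sketch in Python, with hypothetical numbers chosen only for illustration:

import math

def channel_capacity(bandwidth_hz, signal_to_noise):
    # Shannon-Hartley capacity: the sharp maximum rate, in bits per
    # second, of a white-noise channel with the given bandwidth and
    # signal-to-noise power ratio.
    return bandwidth_hz * math.log2(1 + signal_to_noise)

# A voice-grade line of roughly 3,000 Hz with a signal-to-noise ratio
# of 1,000 (30 decibels) tops out near 30,000 bits per second.
print(channel_capacity(3000, 1000))  # about 29,902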

Any transmission is vulnerable to error—random zeroes received as ones or vice versa. Shannon showed that if the transmission rate is less than the maximum, then there exist ways to send the data (by “coding” the transmission) so that the probability of error can be made arbitrarily small. The work of finding such codes, however, was left to others who took up the challenge. Today, data compression algorithms that rely on Shannon’s theorems are used for an array of digital tasks, from recording music to sending pictures from Mars.
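To give a flavor of how coding buys reliability, consider a toy scheme, far cruder than the capacity-approaching codes whose existence Shannon proved: send each bit three times and decode by majority vote. A decoding error then requires at least two of the three copies to flip, as this sketch works out:

def repetition_error_rate(p):
    # A bit sent three times is decoded wrongly only if two or three
    # copies flip, each flipping independently with probability p:
    # 3 * p^2 * (1 - p) + p^3.
    return 3 * p**2 * (1 - p) + p**3

# A raw flip probability of 1 in 100 falls to roughly 3 in 10,000.
print(repetition_error_rate(0.01))  # about 0.000298

The price of repetition is a threefold drop in rate; Shannon’s remarkable claim was that far better trade-offs exist whenever the rate stays below the channel maximum.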

An abstract interpretation of the word information lies at the heart of Shannon’s theory. Gone are semantic meanings. Any string of zeroes and ones satisfying a particular list of rules (for example, “zero cannot be followed immediately by zero”) could be acceptable. English words can be communicated in this way by assigning different strings of zeroes and ones to individual letters (including an additional “letter” for a space).
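A minimal sketch of such an encoding, with an assignment of bit strings that is entirely arbitrary: five bits yield 32 distinct patterns, enough for the 26 letters plus the space.

ALPHABET = " ABCDEFGHIJKLMNOPQRSTUVWXYZ"
# Assign each symbol its own fixed-length five-bit string.
CODE = {ch: format(i, "05b") for i, ch in enumerate(ALPHABET)}

def encode(text):
    # Translate a message into the stream of zeroes and ones that
    # would be sent over the channel.
    return "".join(CODE[ch] for ch in text.upper())

print(encode("A MIND AT PLAY"))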

As Shannon observed, our language has a certain amount of redundancy built in. For example, you can read not only this sentence but, as Soni and Goodman relay, “MST PPL HV LTTL DFFCLTY N RDNG THS SNTNC” as well—a condition familiar to anyone who sends text messages. Shannon gave a definition for the amount of information transmitted in a message. He then defined the rate at which information is transmitted, which he called entropy. For example, if we restrict ourselves to messages of zeroes and ones, then a source that can produce only ones would have zero entropy, whereas a source that produces zeroes and ones with the flip of a coin would have the largest possible entropy.
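In symbols, a binary source that emits a one with probability p has entropy H = −p log2 p − (1 − p) log2(1 − p) bits per symbol. A brief sketch confirms the two extremes just mentioned:

import math

def entropy(p_one):
    # Shannon entropy, in bits per symbol, of a source that emits a
    # one with probability p_one and a zero otherwise.
    h = 0.0
    for p in (p_one, 1.0 - p_one):
        if p > 0:
            h -= p * math.log2(p)
    return h

print(entropy(1.0))  # 0.0: a source of all ones carries no surprise
print(entropy(0.5))  # 1.0: a fair coin attains the maximum
print(entropy(0.9))  # about 0.469: a biased coin falls in between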

Soni and Goodman relate a famous story about Shannon’s choice of the word “entropy.” The mathematician John von Neumann noted the uncanny similarity between Shannon’s notion and one that had been used in thermodynamics for decades. “You should call it entropy, for two reasons,” von Neumann advised. “In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one knows what entropy really is, so in a debate you will always have the advantage.”


“Almost certainly, this conversation never happened,” insist the authors, echoing doubts raised elsewhere. However, Shannon himself related the story, exactly as above, in a 1971 interview with the engineer Myron Tribus. Regardless of whether the story is true, Shannon’s appropriation of the term “entropy” inspired a productive debate about deep connections between information and thermodynamics. Mathematicians who work in the areas of probability and dynamical systems then heard about Shannon’s definition and found that it could be extended and used effectively in their own work. It is not difficult to imagine that von Neumann, one of the greatest mathematical minds of the 20th century, anticipated some of these later developments.

It is tempting to look back at the early life of a genius and search for signs that promise future greatness. In the case of Claude Shannon, however, we find few indicators. We read that Shannon won a third-grade Thanksgiving story-writing contest and that he played alto horn in school musicals. He loved to build and fix things, especially radios, as did many youngsters in the 1920s. He found mathematics easy and enjoyed its competitive aspects, but no evidence is offered of exceptional mathematical ability.

What little is revealed about Shannon’s college career also fails to predict eminence. He attended the University of Michigan, where he earned dual bachelor’s degrees in mathematics and electrical engineering. He was elected to the Phi Kappa Phi and Sigma Xi honor societies. He published two solutions to questions proposed in the American Mathematical Monthly, an expository journal intended for both students and faculty. These accomplishments are laudable but certainly not rare. Unfortunately, we don’t learn who Shannon’s teachers were or what mathematics and science courses he took at the University of Michigan. Such information might help anticipate the first blaze of Shannon’s genius: his master’s thesis, completed in 1937.

Serendipity is a standard ingredient of notable careers, and for Shannon it arrived in the spring of 1936, near the end of his studies at Michigan, when he noticed a typed card stuck to a bulletin board. It advertised a graduate assistantship at the Massachusetts Institute of Technology with the duty of running a differential analyzer, a mechanical computer designed to solve differential and integral equations. Such analog computers had been around since 1876, but this one also had some digital components and was the first capable of general applications. Eventually it would solve differential equations with 18 independent variables. Its inventors were Harold Hazen and Vannevar Bush. “I pushed hard for that job and got it,” Shannon recalled. “That was one of the luckiest things of my life.”

Vannevar Bush was a towering figure in American science. He had joined MIT’s electrical engineering department in 1919 and three years later cofounded a military supplier that is now called the Raytheon Company. In 1941, Bush would help convince President Roosevelt to begin building an atomic bomb, and he would take a leading role in its development. At MIT Bush recognized Shannon’s brilliance and took a serious interest in his career, guiding him through graduate school and on to Bell Labs.

Shannon’s thesis, “A Symbolic Analysis of Relay and Switching Circuits,” is regarded as one of the most important master’s theses ever written. Completed in 1937, it used the nearly century-old ideas of the British logician George Boole to simplify the arrangement of the relays that make up electrical networks. Elegant and practical, Shannon’s system provided a basis for modern digital circuit design. Most mathematicians who teach applications of Boolean algebra to electrical circuits in discrete mathematics courses do not realize they are presenting the ideas in Shannon’s thesis. More than 50 years later, Shannon downplayed the significance of his discovery. “It just happened no one else was familiar with both fields at the same time,” he told an interviewer, adding, “I’ve always loved that word. ‘Boolean.’”
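A small example conveys the flavor of the thesis, though the circuit here is invented for illustration rather than drawn from Shannon’s text: a network that passes a signal when (A and B) or (A and not B) holds behaves exactly like relay A alone, so Boolean simplification lets one relay replace two. A few lines of Python check the identity over every input:

def two_relay_circuit(a, b):
    # Signal passes through (A AND B) in parallel with (A AND NOT B).
    return (a and b) or (a and not b)

def one_relay_circuit(a, b):
    # Boolean algebra reduces the expression above to A alone.
    return a

# Verify the simplification by exhausting all relay states.
assert all(two_relay_circuit(a, b) == one_relay_circuit(a, b)
           for a in (False, True) for b in (False, True))
print("The two-relay network reduces to a single relay.")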

Bush was not only a good judge of intellect but also a shrewd observer of temperament. He might have been worried about his new protégé. Shannon had lost his father in his sophomore year, and for some reason stopped speaking to his mother shortly afterward. Bush encouraged Shannon to spend time at Cold Spring Harbor Laboratory and apply Boolean algebra to Mendelian genetics. He would be supervised by Barbara Stoddard Burks, a sympathetic psychologist interested in the genetics of genius and in questions of nature versus nurture. The Genetics Records Office at Cold Spring Harbor had more than 25 years of data for Shannon to contemplate. In less than one year Shannon had learned enough genetics to complete his Ph.D. dissertation, “An Algebra for Theoretical Genetics,” a masterful but overly theoretical work that would have little to offer geneticists. The experience confirmed the opinion that Bush and Burks shared: Shannon was a genius who could acquire knowledge of a new subject quickly and from it create significant mathematics. Shannon himself, however, had little regard for the work. He fled the field and never bothered to publish his dissertation. Some years later he remarked, “I had a good time acting as a geneticist for a couple of years.”

After receiving his doctorate, Shannon spent a summer at Bell Labs and a year at the Institute for Advanced Study in Princeton before finding full-time employment back at Bell Labs.

Shannon was fortunate to work at Bell Labs during a period when research and development in the United States was generously funded. His brilliance entitled him to a freedom that seems impossible today, in a time of international competition and demands by shareholders for fast profits. With characteristic modesty, Shannon once admitted to a supervisor, “It always seemed to me that the freedom I took [at the Labs] was something of a special favor.”

Lured by a change of scene and the relative security of academia, Shannon accepted a position at MIT in 1958. He retired in 1978. The good luck that had followed him for so long finally departed in the early 1980s as Shannon began displaying signs of Alzheimer’s disease. He died from the illness in 2001.


A Mind at Play is a loving biography recounted by two admirers of Claude Shannon. It is especially good at relating the many stories that have contributed to the growing fascination with its hero. A prolific tinkerer with a singular sense of humor, Shannon invented bizarre and amusing devices, many of which are described. They included a motorized pogo stick, a rocket-powered Frisbee, a juggling machine, a calculator that operates with Roman numerals, and a relay-controlled robotic mouse that could solve a maze and keep track of its solution. An invention of Shannon’s that became known as the “Ultimate Machine” fascinated science-fiction writer Arthur C. Clarke during a visit to Bell Labs. In his 1958 book Voice Across the Sea, Clarke offered a description of the machine’s workings: “When you throw the switch, there is an angry, purposeful buzzing. The lid slowly rises, and from beneath it emerges a hand. The hand reaches down, turns the switch off, and retreats into the box. With the finality of a closing coffin, the lid snaps shut, the buzzing ceases, and peace reigns once more.”

A Mind at Play is somewhat less successful when mathematics appears. For example, both the conclusion of Shannon’s “Theorem on Color Coding” and Hartley’s formula for information are misstated. The authors do an admirable job of describing Shannon’s entropy for a coin toss, but they stop short of explaining it for a more general information source. Readers wishing to learn details of Shannon’s work would do better to go to Shannon’s papers, which are well written and freely available online.
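For the record on the second point: Hartley’s measure, as he stated it in 1928, is simple. A message of n symbols, each drawn from an alphabet of s possibilities, carries information H = n log s, so that with logarithms taken to base 2 every binary digit counts as exactly one bit. Shannon’s entropy refines this count by weighting the possibilities by their probabilities.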

More distressing than minor technical slips is the authors’ discussion of the criticism that followed publication of The Mathematical Theory of Communication. After citing a sharp comment by probabilist Joseph Doob in a review, the authors imagine that pure mathematicians formed a cabal to condemn Shannon’s applied work. Certainly Shannon’s definitions and proofs were not always complete and correct. (For example, Shannon’s theorem about the optimum use of noisy channels by coding, discussed previously, was finally proved by Amiel Feinstein in 1954, and today it is known as the Shannon-Feinstein theorem.) Nevertheless, Shannon’s work was and continues to be used and admired by the mathematical community. Mathematical Reviews, in which Doob published his odd remark, contains nearly 2,000 reviews that refer to Shannon’s entropy.

Shannon did more than open up the new field of information theory. He also demonstrated what can be accomplished by combining passionate inquiry with a fondness for levity. A Mind at Play is an enjoyable biography that unites us with the singular spirit of Claude Shannon.


Daniel S. Silver is an emeritus professor of mathematics at the University of South Alabama. His research explores the relation between knots and dynamical systems, as well as the history of science and the psychology of invention.
