COMPUTING SCIENCE

# Randomness as a Resource

## Supply-Side Issues

Whatever the purpose of randomness, and however light or heavy the demand, it seems like producing the stuff ought to be a cinch. At the very least it should be easier to make random bits than non-random ones, in the same way that it’s easier to make a mess than it is to tidy up. If computers can perform long and intricate calculations where a single error could spoil the entire result, then surely they should be able to churn out some patternless digital junk. But they can't. There is no computer program for randomness.

Of course most computer programming languages will cheerfully offer to generate random numbers for you. In Lisp the expression `(random 100)` produces an integer in the range between 0 and 99, with each of the 100 possible values having equal probability. But these are *pseudo*-random numbers: They "look" random, but under the surface there is nothing unpredictable about them. Each number in the series depends on those that went before. You may not immediately perceive the rule in a series like 58, 23, 0, 79, 48..., but it’s just as deterministic as 1, 2, 3, 4....

The only source of true randomness in a sequence of pseudo-random numbers is a "seed" value that gets the series started. If you supply identical seeds, you get identical sequences; different seeds produce different numbers. The crucial role of the seed was made clear in the 1980s by Manuel Blum, now of Carnegie Mellon University. He pointed out that a pseudo-random generator does not actually generate any randomness; it stretches or dilutes whatever randomness is in the seed, spreading it out over a longer series of numbers like a drop of pigment mixed into a gallon of paint.
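Blum's point is easy to see in a toy generator. The sketch below is a minimal linear congruential generator, a hypothetical stand-in for whatever sits behind an expression like `(random 100)` (the constants are the widely published "Numerical Recipes" parameters, not anything from the article itself). Everything after the seed is pure arithmetic, so identical seeds must yield identical series:

```python
def lcg(seed, n, modulus=2**32, a=1664525, c=1013904223):
    """A minimal linear congruential generator: return n pseudo-random
    integers in the range 0-99, fully determined by the seed."""
    state = seed
    out = []
    for _ in range(n):
        state = (a * state + c) % modulus  # the deterministic rule
        out.append(state % 100)
    return out

# Identical seeds give identical sequences; a different seed
# merely gives a different-looking deterministic series.
print(lcg(42, 5))
print(lcg(42, 5))   # the same five numbers again
print(lcg(43, 5))   # a different series
```

However patternless the output looks, all of its "randomness" was present in the seed; the arithmetic only spreads it out, drop-of-pigment fashion.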

For most purposes, pseudo-random numbers serve perfectly well—often better than true random numbers. Almost all Monte Carlo work is based on them. Even for some cryptographic applications—where standards are higher and unpredictability is everything—Blum and others have invented pseudo-random generators that meet most needs. Nevertheless, true randomness is still in demand, if only to supply seeds for pseudo-random generators. And if true randomness cannot be created in any mathematical operation, then it will have to come from some physical process.

Extracting randomness from the material world also sounds like an easy enough job. Unpredictable events are all around us: the stock market tomorrow, the weather next week, the orbital position of Pluto in 50 million years. Yet finding events that are *totally* patternless turns out to be quite difficult. The stories of the pioneering seekers after randomness are chronicles of travail and disappointment.

Consider the experience of the British biometrician W. F. R. Weldon and his wife, the former Florence Tebb. Evidently they spent many an evening rolling dice together—not for money or sport but for science, collecting data for a classroom demonstration of the laws of probability. But in 1900 Karl Pearson analyzed 26,306 of the Weldons’ throws and found deviations from those laws; there was an excess of fives and sixes.

In 1901 Lord Kelvin tried to carry out what we would now call a Monte Carlo experiment, but he ran into trouble generating random numbers. In a footnote he wrote: "I had tried numbered billets (small squares of paper) drawn from a bowl, but found this very unsatisfactory. The best mixing we could make in the bowl seemed to be quite insufficient to secure equal chances for all the billets."

In 1925 L. H. C. Tippett had the same problem. Trying to make a random selection from a thousand cards in a bag, "it was concluded that the mixing between each draw had not been sufficient, and there was a tendency for neighbouring draws to be alike." Tippett devised a more elaborate randomizing procedure, and two years later he published a table of 41,600 random digits. But in 1938 G. Udny Yule submitted Tippett's numbers to statistical scrutiny and reported evidence of "patchiness."

Ronald A. Fisher and Frank Yates compiled another table of 15,000 random digits, using two decks of playing cards to select numbers from a large table of logarithms. When they were done, they discovered an excess of sixes, and so they replaced 50 of them with other digits "selected at random." (Two of their statistical colleagues, Maurice G. Kendall and Bernard Babington Smith, comment mildly: "A procedure of this kind may cause others, as it did us, some misgiving.")

The ultimate random-number table arrived with a thump in 1955, when the Rand Corporation published a 600-page tome titled *A Million Random Digits with 100,000 Normal Deviates*. The Rand randomizers used "an electronic roulette wheel" that selected one digit per second. Despite the care taken in the construction of this device, "Production from the original machine showed statistically significant biases, and the engineers had to make several modifications and refinements of the circuits." Even after this tune-up, the results of the month-long run were still unsatisfactory; Rand had to remix and shuffle the numbers before the tables passed statistical tests.

Today there is little interest in publishing tables of numbers, but machines for generating randomness are still being built. Many of them find their source of disorder in the thermal fluctuations of electrons wandering through a resistor or a semiconductor junction. This noisy signal is the hiss or whoosh you hear when you turn up an amplifier’s volume control. Traced by an oscilloscope, it certainly looks random and unpredictable, but converting it into a stream of random bits or numbers is not straightforward.

The obvious scheme for digitizing noise is to measure the signal at certain instants and emit a 1 if the voltage is positive or a 0 if it is negative. But it’s hard to build a measuring circuit with a precise and consistent threshold between positive and negative voltage. As components age, the threshold drifts, causing a bias in the balance between 1s and 0s. There are circuits and computational tricks to correct this problem, but the need for such fixes suggests just how messy it can be to get a physical device to conform to a mathematical ideal—even when the ideal is that of pure messiness.
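The article doesn't name the computational tricks, but the classic one is von Neumann's: read the biased stream in non-overlapping pairs, emit a bit only when the two members of a pair differ, and discard the rest. If the input bits are independent, the pairs (0,1) and (1,0) are equally likely no matter where the threshold has drifted. A sketch, with a simulated 70-percent-biased source standing in for the aging circuit:

```python
import random

def von_neumann_debias(bits):
    """Von Neumann's trick: scan non-overlapping pairs, emit the first
    bit of each unequal pair, and discard (0,0) and (1,1).  For
    independent input bits the output is unbiased at any input bias."""
    out = []
    for b1, b2 in zip(bits[::2], bits[1::2]):
        if b1 != b2:
            out.append(b1)
    return out

random.seed(1)
# Simulate a drifted threshold that emits 1 about 70 percent of the time.
biased = [1 if random.random() < 0.7 else 0 for _ in range(100_000)]
debiased = von_neumann_debias(biased)
print(sum(biased) / len(biased))      # roughly 0.70
print(sum(debiased) / len(debiased))  # close to 0.50
```

The price of the correction is throughput: the more biased the source, the more pairs agree and get thrown away.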

Another popular source of randomness is the radioactive decay of atomic nuclei, a quantum phenomenon that seems to be near the ultimate in unpredictability. A simple random-number generator based on this effect might work as follows. A Geiger-Müller tube detects a decay event, while in the background a free-running oscillator generates a high-frequency square-wave signal—a train of positive and negative pulses. At the instant of a nuclear decay, the square wave is sampled, and a binary 1 or 0 is output according to the polarity of the pulse at that moment. Again there are engineering pitfalls. For example, the circuitry’s "dead time" after each event may block detection of closely spaced decays. And if the positive and negative pulses in the square wave differ in length even slightly, the output will be biased.
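The second pitfall—unequal positive and negative pulses—can be seen in a toy simulation (no real hardware here; decay arrival times are idealized as uniformly random within a period, and `positive_fraction` is an invented parameter for the square wave's duty cycle):

```python
import random

def decay_sampler(n_events, period=1.0, positive_fraction=0.5):
    """Sketch of the decay-sampling scheme: each decay arrives at an
    effectively random moment, and we read off the square wave's
    polarity at that instant.  positive_fraction is the share of each
    period the wave spends positive; 0.5 means symmetric pulses."""
    bits = []
    for _ in range(n_events):
        phase = random.random() * period  # random arrival time in a period
        bits.append(1 if phase < positive_fraction * period else 0)
    return bits

random.seed(2)
fair = decay_sampler(100_000, positive_fraction=0.50)
skewed = decay_sampler(100_000, positive_fraction=0.52)
print(sum(fair) / len(fair))      # near 0.50
print(sum(skewed) / len(skewed))  # near 0.52: a 4-percent pulse
                                  # asymmetry shows up directly as bias
```

Even a slight asymmetry in the pulse train translates one-for-one into a biased bit stream, which is why such generators still need the kind of debiasing fixes described above.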

Hardware random-number generators are available as off-the-shelf components you can plug into a port of your computer. Most of them rely on thermal electronic noise. If your computer has one of the latest Intel Pentium processors, you don't need to plug in a peripheral: The random-number generator is built into the CPU chip. There are also several Web sites that serve up free samples of randomness. George Marsaglia of Florida State University has some 4.8 billion carefully tested random bits available to the public. And there are less-conventional sources of randomness, most famously "lavarand," at Silicon Graphics, where random bits are extracted from images of the erupting blobs inside six Lava Lite lamps. (Lately the lamps have gone out, although samples remain available at lavarand.sgi.com.)
