
COMPUTING SCIENCE

Reverse Engineering

If computers are to run faster, they may need to run both forward and backward

Brian Hayes

Zeptojoules

For a long time it was taken for granted that storing, manipulating and transmitting information must always dissipate some nonzero quantity of energy. Engineering prowess might lower the energy cost somewhat, but there was a threshold level, a lower limit we could never cross. A device that could compute without loss of energy was seen as the information equivalent of a perpetual-motion machine.

John von Neumann, in a 1949 lecture, set the minimum price of "an elementary act of information" at kT ln 2. In this formula k is Boltzmann's constant, which is the conversion factor for expressing temperature in energy units; its numerical value is 1.4 × 10⁻²³ joules per kelvin. T is the absolute temperature, and ln 2 is the natural logarithm of 2, a number that appears here because it corresponds to one bit of information—the amount of information needed to distinguish between two equally likely alternatives. At room temperature (300 kelvins), kT ln 2 works out to about 3 × 10⁻²¹ joule, or 3 zeptojoules. This is a minuscule amount of energy; Ralph C. Merkle of the Georgia Institute of Technology estimates that it is the average kinetic energy of a single air molecule at room temperature.
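As a sanity check on that arithmetic, here is a minimal sketch in Python (the constant is the standard SI value; the variable names are mine):

    import math

    k = 1.380649e-23      # Boltzmann's constant, joules per kelvin
    T = 300.0             # room temperature, kelvins

    # von Neumann's minimum price of an "elementary act of information"
    E = k * T * math.log(2)

    print(E)              # about 2.87e-21 joules
    print(E / 1e-21)      # about 2.87 zeptojoules -- "about 3"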

Von Neumann's pronouncement was based on a thermodynamic argument. Consider a computation that answers a single yes/no question, where the two possible outcomes appear equally likely at the outset. Once the question has been settled, we know one bit more than we did beforehand, and so the computational process reduces the uncertainty or entropy of the computing system by one bit. But the second law of thermodynamics says that total entropy can never decrease, and so the reduction inside the computer must be compensated by an entropy increase elsewhere. Specifically, the computer must stir up at least one bit's worth of disorder in its surroundings by expelling an amount of heat equal to kT ln 2. Von Neumann—along with everyone else at the time—assumed that every "elementary act of information" has the effect of settling at least one yes/no question, and thus it seemed that each step in the computer's operation inevitably dissipates at least three zeptojoules of energy.
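The entropy bookkeeping behind that argument can be written out in a few lines; this is a sketch of the standard reasoning, in my notation rather than von Neumann's:

    \Delta S_{\mathrm{computer}} = -k \ln 2 \quad \text{(one bit of uncertainty removed)}

    \Delta S_{\mathrm{computer}} + \Delta S_{\mathrm{surroundings}} \ge 0
    \;\Longrightarrow\;
    \Delta S_{\mathrm{surroundings}} \ge k \ln 2

    Q = T\,\Delta S_{\mathrm{surroundings}} \ge kT \ln 2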

Von Neumann's ideas on the thermodynamics of computation were widely accepted but never formally proved. In the early 1960s Rolf Landauer set out to supply such a proof and found that he couldn't. He discovered that only a certain subclass of computational events have an unavoidable three-zeptojoule cost. Ironically, these expensive operations are not those that produce information but rather those that destroy it, such as erasing a bit from a memory cell.

Landauer's work on the cost of forgetting was counter-intuitive, and initially it got a frosty reception. Now the idea has been thoroughly assimilated, and it's hard to see what the controversy was all about. Erasing a memory cell amounts to ignoring its present contents—which may in fact be unknown—and resetting the cell to some standardized state (usually 0). Thus an indeterminate bit becomes a fully specified one, and the entropy of the machine is diminished accordingly. For this reason the corresponding amount of heat energy (kT ln 2) has to be rejected into the environment. The consequences are even clearer if you think about erasing the entire memory of a computer, so that the system goes from a random state to a highly ordered one; this is a process of refrigeration, and so it obviously gives off heat.
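The asymmetry is easy to exhibit in a toy program. The sketch below (Python; the functions and the energy tally are purely illustrative, not anyone's published model) contrasts a reversible operation, NOT, whose input can always be recovered, with erasure, which maps both 0 and 1 to the same output and so must pay the Landauer toll:

    import math

    K = 1.380649e-23                 # Boltzmann's constant, J/K
    T = 300.0                        # room temperature, kelvins
    LANDAUER = K * T * math.log(2)   # minimum cost of erasing one bit

    def invert(bit):
        # Reversible: invert(invert(b)) == b, so no information is
        # lost and thermodynamics imposes no minimum heat.
        return 1 - bit

    def erase(bit):
        # Irreversible: 0 and 1 both become 0; one bit of information
        # disappears, so at least kT ln 2 must be expelled as heat.
        return 0, LANDAUER

    for b in (0, 1):
        assert invert(invert(b)) == b
        cleared, heat = erase(b)
        print(f"{b} -> {cleared}: at least {heat:.2e} J dissipated")

Note that no physical bookkeeping is being simulated here; the point is only that erase discards the distinction between its two possible inputs, while invert preserves it.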
