From the July-August 2006 issue of American Scientist (Volume 94, Number 4, Page 364). DOI: 10.1511/2006.60.364

Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos. Seth Lloyd. xii + 221 pp. Alfred A. Knopf, 2006. $25.95.

In the 1940s, computer pioneer Konrad Zuse began to speculate that the universe might be nothing but a giant computer continually executing formal rules to compute its own evolution. He published the first paper on this radical idea in 1967, and since then it has provoked an ever-increasing response from popular culture (the film The Matrix, for example, owes a great deal to Zuse's theories) and hard science alike.


Given this backdrop, Seth Lloyd appears to be exaggerating when he claims in his informative and entertaining new book that he "advocates a new paradigm" by postulating the universe to be a machine that processes information. However, in the book, which is titled Programming the Universe, Lloyd does somewhat distinguish himself from his predecessors by focusing on the weird world of quantum computation. He lucidly explains what quantum computation is all about, how the process of quantum entanglement seems to involve an instantaneous exchange of information between locations that can be light-years apart, and why this phenomenon unfortunately cannot be exploited to transmit information faster than light. He also describes how quantum computers would be able to solve certain problems much faster than their traditional counterparts.

The book's central conceit is that the universe can be viewed as a giant quantum computer made up of connected quantum gates that flip quantum bits and thereby propagate information and uncertainty in an "infectious" way. Lloyd uses results by Hans Joachim Bremermann, Norman Margolus and Lev Levitin to calculate the processing power of the "ultimate laptop" (one with 1 kilogram of mass and 1 liter of volume): a maximum of 10⁵¹ operations per second on 10³² bits. A good fan would be needed, though: The massively parallel laptop would be roughly 100 times hotter than the center of the Sun. Lloyd also calculates that the visible universe has so far computed about 10¹²² operations on 10⁹² bits. Doesn't sound like a lot, does it?
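For readers who want to see where a figure like 10⁵¹ comes from, here is a minimal back-of-the-envelope sketch in Python. It assumes the Margolus-Levitin bound of 2E/(πℏ) elementary operations per second and, as Lloyd does, idealizes all of the laptop's mass-energy E = mc² as available for computation; the ~10³² bit memory figure comes from a separate maximum-entropy estimate that is not reproduced here.

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 2.99792458e8         # speed of light, m/s

mass_kg = 1.0                      # the "ultimate laptop": 1 kilogram of mass
energy_j = mass_kg * C**2          # total mass-energy E = mc^2, about 9e16 J

# Margolus-Levitin bound: at most 2E/(pi*hbar) operations per second
ops_per_second = 2 * energy_j / (math.pi * HBAR)

print(f"E ~ {energy_j:.2e} J")
print(f"max ops/s ~ {ops_per_second:.1e}")   # about 5e50, i.e. of order 10^51
```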

Like most quantum physicists, Lloyd believes that there is a source of true randomness, which manifests itself whenever we measure a quantum bit. He does not explicitly mention, however, that this belief has never been experimentally proved. Neither Heisenberg's uncertainty principle nor Bell's inequality (which rests on the very assumption of randomness) excludes the possibility that the universe (including all observers inhabiting it) is in principle computable by a deterministic computer, as first suggested by Zuse.

In fact, Lloyd's belief in true randomness also seems inconsistent with his invocation of Ockham's razor, which favors simple explanations of the universe's history over complex ones. According to both standard and algorithmic information theory, true randomness actually corresponds to maximal information, complexity and description length—the opposite of simplicity.

As long as the (currently somewhat unfashionable) hypothesis of determinism remains unfalsified, many scientists will be dissatisfied by an explanation of our universe's history that requires an enormous amount of information to describe all the random events that have taken place, in addition to the known, compactly describable physical laws. Physicists should keep searching for simple, deterministic, pseudorandom computational rules explaining any type of hitherto-unexplained apparent randomness. Einstein, whose belief that "God does not play dice" has not yet been proved wrong, would probably agree.

The book is least convincing when it comes to the topics of complexity, entropy and algorithmic information. Lloyd compares random events at the quantum level to monkeys typing a random program on the universal computer; this is linked to Ray Solomonoff's basic concept of algorithmic probability theory—namely, that short random programs are more likely than long ones. However, the point of Solomonoff's approach is that some programs can remain short by ceasing to read new input bits. This essential feature seems absent from Lloyd's setup, which demands the permanent creation of new bits corresponding to never-ending programs, thus making each "program" extremely unlikely.
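To make the contrast concrete, Solomonoff's algorithmic probability of a string x, relative to a universal prefix machine U, can be written in its simplest form as follows (standard notation, not Lloyd's):

```latex
M(x) \;=\; \sum_{p \,:\, U(p) = x} 2^{-\ell(p)}
```

Here the programs p are self-delimiting: a program that stops reading further input bits has finite length ℓ(p), and the Kraft inequality then guarantees that the weights 2^{-ℓ(p)} sum to at most 1. That halting of the input stream is precisely the feature whose absence from Lloyd's setup the review points out.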

Of all the various complexity measures, Lloyd explicitly favors those that combine the difficulty of doing something with the difficulty of describing it. The obvious choice here would be Leonid A. Levin's Kt-complexity (and its probabilistic derivative, the Speed Prior), which trades off space and computation time in a way that is theoretically optimal in a certain sense. Lloyd, however, does not even mention it. Instead he focuses on measures based on Charles H. Bennett's logical depth, and something called "effective complexity," which reflects a preliminary attempt by physicist Murray Gell-Mann and Lloyd himself to hide the complexity of truly random bits. Generally speaking, the connections between Lloyd's model of quantum processing and algorithmic information theory seem vague.
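For reference, Levin's Kt-complexity of a string x with respect to a universal machine U is commonly stated roughly as below; the exact formulation varies slightly between texts.

```latex
Kt(x) \;=\; \min_{p} \bigl\{\, \ell(p) + \log t \;:\; U(p) \text{ outputs } x \text{ within } t \text{ steps} \,\bigr\}
```

The additive log t term trades program length against running time, which is what gives Levin-style search its optimality property and makes Kt a natural candidate for exactly the kind of measure Lloyd says he favors.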

Some of Lloyd's statements reflect a certain naiveté about some topics in computer science. For example, he writes that "According to the Church-Turing hypothesis, every possible mathematical structure is represented in some component of the superposition" of all possible computations, despite the fact that much of mathematics deals with incomputable objects. Other incorrect statements include this one: "In fact, every universal computer can be shown not only to simulate every other universal computer, but to do so efficiently."

Lloyd's historical notes on computation and bits refer to Charles Babbage's Analytical Engine and John Napier's logarithmic bones but fail to mention Gottfried Wilhelm Leibniz, the inventor of the bit (1700), and Wilhelm Schickard, constructor of the very first (non-program-controlled) computer, in 1623. For some reason Lloyd also seems to give equal credit to Zuse and Ed Fredkin as creators of the "universe as a computer" idea, although Fredkin got into this business long after Zuse. On the other hand, Lloyd did enrich my understanding by pointing out that the "many worlds" theory (usually attributed to the physicist Hugh Everett) can be traced back to the poet and short-story writer Jorge Luis Borges.

Lloyd spices his story with interesting and sometimes touching personal tales of his career at the border between computer science and physics. Despite my few quibbles, I recommend this well-written book without hesitation to anybody interested in an overview of basic ideas in the field. I intend to buy a few copies as presents for my friends.
