
COMPUTING SCIENCE

The Square Root of NOT

Brian Hayes

Digital computers are built out of circuits that have definite, discrete states: on or off, zero or one, high voltage or low voltage. Engineers go to great lengths to make sure these circuits never settle into some intermediate condition. Quantum-mechanical systems, as it happens, offer a guarantee of discreteness without any engineering effort at all. When you measure the spin orientation of an electron, for example, it is always either "up" or "down," never in between. Likewise an atom gains or loses energy by making a "quantum jump" between specific energy states, without passing through intermediate energy levels. So why not build a digital computer out of quantum-mechanical devices, letting particle spins or the energy levels of atoms stand for binary units of information?

One answer to this "Why not?" question is that you can't avoid building a quantum-mechanical computer even if you try. Since quantum mechanics appears to be a true theory of nature, it governs all physical systems, including the transistors and other components of the computer on your desk. All the same, quantum effects are seldom evident in electronic devices; components and circuits are designed so that the quantum states of many millions of electrons are averaged together, blurring their discreteness.

In a quantum computer, the basic working parts would probably have to be individual electrons or atoms, and so another answer to the "Why not?" question is that building such a machine is simply beyond our skills. And even apart from the challenges of atomic-scale fabrication, there are some ticklish conceptual issues. Quantum systems have some famously weird behavior, such as the phenomenon called quantum interference. Two nearby transistors can switch on and off independently, but two adjacent quantum objects (such as two electrons) are inextricably coupled, so that the future state of one electron cannot be predicted without taking into account the surrounding electrons. Indeed, an isolated electron can interfere with itself!
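Interference is what the gate in this column's title exploits. As a sketch (mine, not the article's), the "square root of NOT" can be written as a 2-by-2 complex matrix M with M·M equal to the classical NOT: applied once to a definite bit it leaves the two measurement outcomes equally likely, but applied twice the amplitudes interfere so that the flip becomes certain. The particular matrix below is one conventional choice.

```python
def matmul(a, b):
    """Multiply two 2x2 complex matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(m, v):
    """Apply a 2x2 matrix to a 2-component state vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# One conventional choice for sqrt(NOT):
M = [[(1 + 1j) / 2, (1 - 1j) / 2],
     [(1 - 1j) / 2, (1 + 1j) / 2]]

zero = [1 + 0j, 0 + 0j]               # the definite state "0"

once = apply(M, zero)                 # one application of sqrt(NOT)
probs = [abs(a) ** 2 for a in once]   # measurement odds: 1/2 and 1/2

twice = apply(M, once)                # second application: amplitudes
                                      # interfere, yielding "1" with certainty
NOT = matmul(M, M)                    # M.M is exactly the classical NOT
```

The point of the sketch is that the intermediate state `once` is not a coin flip that has already landed: its two complex amplitudes carry phase information, and the second application of M cancels one path and reinforces the other.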

A third answer to the "Why not?" question is "Why bother?" Until recently there was no reason to believe that a quantum computer could do anything a classical computer couldn't. This situation has now changed dramatically. The exact place of quantum technology in the overall hierarchy of computing machines is still not settled, but a few recently discovered algorithms offer intriguing hints. It turns out that a program written for a quantum computer can factor large numbers faster than any known algorithm for a classical machine. The quantum factoring algorithm makes essential use of interference effects, which become a source of parallelism, allowing the computer to explore all possible solutions to a problem simultaneously. Factoring is a task of much theoretical interest, and it also has practical applications in cryptography, so these discoveries have attracted considerable notice.
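For a sense of the classical side of that comparison, here is a baseline of my own (not from the article): factoring by trial division. Its running time grows roughly as the square root of n, which is exponential in the number of digits of n; that exponential wall is what makes factoring-based cryptography viable, and what the quantum algorithm threatens to breach.

```python
def factor(n):
    """Return the prime factorization of n (>= 2) by trial division,
    smallest factors first."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:     # divide out each prime as often as it occurs
            factors.append(d)
            n //= d
        d += 1
    if n > 1:                 # whatever remains is itself prime
        factors.append(n)
    return factors
```

Trial division is far from the best classical method, but even the most sophisticated known classical algorithms remain super-polynomial in the number of digits, which is the gap the quantum factoring algorithm exploits.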
