
COMPUTING SCIENCE

Programming Your Quantum Computer

The hardware doesn’t yet exist, but languages for quantum coding are ready to go.

Brian Hayes

Abstracted to Distraction

My complaint that quantum computation seems too much like a laboratory experiment is a little unfair. Classical computing has the same complexion if you examine it closely enough. Adding a column of numbers in a spreadsheet could be described as preparing a set of bits in the appropriate initial state, applying the summation operator, and measuring the final state of the bits. But no one thinks of the process in those primitive terms.
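To make the analogy concrete, here is a toy Python sketch of a spreadsheet sum recast in those three stages. The names (prepare, apply_summation, measure) are illustrative inventions, not any real spreadsheet or quantum API:

    def prepare(values):
        # "Initial state": write each number out as a raw 16-bit pattern.
        return [format(v, "016b") for v in values]

    def apply_summation(state):
        # The "summation operator": transform the state into a new state.
        total = sum(int(bits, 2) for bits in state)
        return format(total, "016b")

    def measure(bits):
        # "Measurement": read the final bit pattern back as a number.
        return int(bits, 2)

    measure(apply_summation(prepare([3, 5, 7])))  # -> 15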

Computer science has evolved a hierarchy of conceptual layers that hide the details of layers below them. At the bottom are physical entities such as transistors and electronic circuitry. Next come logic gates (AND, OR, etc.), which operate on symbols (true and false, 0 and 1) rather than voltages and currents. The gates are assembled into registers, adders, and the like; then an instruction set defines commands for manipulating data within these components. Finally, the details of the instruction set are hidden by the constructs of a higher-level programming language: procedures, iterations, arrays, lists, and so on.
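A short Python sketch can make the layer cake tangible. It assumes nothing beyond the language itself, and the gate and adder functions are stand-ins for hardware, not a real instruction set:

    # Bottom layer (transistors) omitted; we start at logic gates.
    def AND(a, b): return a & b
    def XOR(a, b): return a ^ b
    def OR(a, b):  return a | b

    # Gates assembled into a component: a one-bit full adder.
    def full_adder(a, b, carry_in):
        partial = XOR(a, b)
        return XOR(partial, carry_in), OR(AND(a, b), AND(partial, carry_in))

    # An "instruction" built on that component: add two 4-bit values.
    def add4(x, y):
        total, carry = 0, 0
        for i in range(4):
            bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
            total |= bit << i
        return total

    # The high-level construct that hides every layer below it.
    assert add4(6, 3) == 6 + 3

Each definition uses only the layer beneath it, which is exactly what lets a programmer write 6 + 3 without ever picturing a carry chain.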

Creating complex software would be beyond human abilities without the abstraction barriers that separate these layers. It’s just not possible to think about the design of a large program in terms of electric currents flowing through billions of transistors. As Alfred North Whitehead wrote, “Civilisation advances by extending the number of important operations which we can perform without thinking about them.”

But the barriers are seldom perfect. Modern processor chips have multiple cores that execute streams of instructions in parallel; a programmer cannot take full advantage of that parallelism without thinking about lower-level details. Thus civilisation retreats a little. Quantum computing, too, will surely trespass on some abstraction barriers.
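As one illustration of that leak, consider summing a list on several cores with Python's standard multiprocessing module. How to split the work, and across how many workers, are hardware-shaped decisions that the sequential sum() never exposes:

    from multiprocessing import Pool

    def parallel_sum(values, workers=4):
        # The split into chunks is the low-level detail that leaks
        # through the abstraction: one slice of the data per core.
        size = max(1, len(values) // workers)
        chunks = [values[i:i + size] for i in range(0, len(values), size)]
        with Pool(workers) as pool:
            return sum(pool.map(sum, chunks))

    if __name__ == "__main__":
        print(parallel_sum(list(range(1_000_000))))  # same answer as the sequential sum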



