
COMPUTING SCIENCE

Graph Theory in Practice: Part II

Brian Hayes

Graph Practice, in Theory

It may seem remarkable that the abstract and austere formalism of graph theory would prove useful in explaining such worldly phenomena as the architecture of the World Wide Web or the casting of Hollywood films or patterns of telephone calls. But graph theory is a branch of mathematics that has never been afraid to get its hands dirty with applications. Early on, it had close and fruitful encounters with organic chemistry and electrical engineering.

In another sense, though, the sudden blooming of graphs—the tendency to see them everywhere we look—is indeed new and noteworthy. For a long time, science has preferred to see the world through a grid of Cartesian coordinates, organizing everything around us according to row and column, rank and file, latitude and longitude. Although there have always been exceptions to this rectilinear prejudice—notably the treelike structures of taxonomists and grammarians—the simplicity of lattices has been hard to resist. And both lattices and trees have a convenient property that physicists call locality. The only way to move through a lattice or a tree is to crawl from one node to the next. There is no action at a distance; there are no secret wormholes connecting distant parts of the universe.

Graphs are a generalization of both lattices and trees. They admit more-flexible arrangements and less-regular connections in the way the world is put together. In graphs the principle of locality can be violated, as the shortcuts in the Watts-Strogatz model make explicit. Such subterranean channels make the world a harder place to comprehend, but perhaps a more interesting place to inhabit.
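The effect of such shortcuts is easy to demonstrate numerically. The sketch below (my own illustration in Python, not code from this article) builds a ring lattice, where the only way to travel is from one node to its near neighbors, then splices in a few random long-range edges in the spirit of the Watts-Strogatz shortcuts. (Strictly speaking, adding edges rather than rewiring existing ones is the Newman-Watts variant of the model.) A handful of shortcuts collapses the mean path length.

import random
from collections import deque

def ring_lattice(n, k):
    """Ring of n nodes, each joined to its k nearest neighbors on either side."""
    adj = {v: set() for v in range(n)}
    for v in range(n):
        for offset in range(1, k + 1):
            w = (v + offset) % n
            adj[v].add(w)
            adj[w].add(v)
    return adj

def add_shortcuts(adj, count, rng):
    """Splice in `count` random long-range edges, the locality-violating shortcuts."""
    nodes = list(adj)
    added = 0
    while added < count:
        v, w = rng.sample(nodes, 2)
        if w not in adj[v]:
            adj[v].add(w)
            adj[w].add(v)
            added += 1

def mean_path_length(adj):
    """Average shortest-path length over all pairs, by BFS from every node."""
    total = pairs = 0
    for source in adj:
        dist = {source: 0}
        queue = deque([source])
        while queue:
            v = queue.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
        total += sum(dist.values())   # dist[source] is 0 and adds nothing
        pairs += len(dist) - 1
    return total / pairs

rng = random.Random(0)
g = ring_lattice(1000, 5)       # strictly local: every step is to a near neighbor
print(mean_path_length(g))      # about n/(4k) = 50 hops
add_shortcuts(g, 20, rng)       # 20 shortcuts among 5,000 lattice edges
print(mean_path_length(g))      # falls sharply: the small-world effect

With n = 1,000 and k = 5, the pure lattice averages roughly n/(4k) = 50 hops between random pairs of nodes; 20 shortcuts, less than half of one percent of the 5,000 lattice edges, typically cut that average by a large factor. Libraries such as networkx offer the model ready-made (watts_strogatz_graph), but the brute-force version makes the mechanism plain.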

© Brian Hayes
