
COMPUTING SCIENCE

How to Avoid Yourself

Brian Hayes

I Wonder as I Wander

In trying to understand the self-avoiding walk, a good place to begin is with a walk that doesn't bother to avoid itself but lurches over the landscape entirely at random. At each step of such a walk you choose one of the four neighboring lattice sites with equal probability, and move there. If you repeat this process a few hundred times, and draw a line behind you as you go, the result is a scribble with a random but nonetheless distinctive and recognizable geometry.
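The procedure is simple enough to try at home. Here is a minimal sketch in Python (the function name random_walk and the particular step set are my own choices for illustration, not anything taken from this article):

    import random

    def random_walk(n):
        """Return the sequence of lattice sites visited by an
        n-step random walk on the square lattice."""
        steps = [(0, 1), (1, 0), (0, -1), (-1, 0)]   # north, east, south, west
        x, y = 0, 0
        path = [(x, y)]
        for _ in range(n):
            dx, dy = random.choice(steps)            # each direction with probability 1/4
            x, y = x + dx, y + dy
            path.append((x, y))
        return path

    print(random_walk(10))

Plot the successive (x, y) pairs with line segments between them and the characteristic scribble appears.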

How many different random paths can be traced on a square lattice? They are easy to count. From any fixed point of origin there are just four walks consisting of a single step, namely those going one unit north, east, south or west. On the second step each of these walks can be continued in any of four directions, and so there are 16 two-step walks. For every further step the number of walks is again multiplied by 4, so that the number of n-step walks is 4^n.
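Brute-force enumeration confirms the arithmetic. The sketch below (my own illustration again) recursively extends a walk in all four directions at every step and checks the resulting total against 4^n:

    def count_walks(n):
        """Count all n-step walks from the origin by explicit enumeration."""
        steps = [(0, 1), (1, 0), (0, -1), (-1, 0)]
        def extend(x, y, remaining):
            if remaining == 0:
                return 1
            # Tracking the position (x, y) is unnecessary for a walk that
            # may cross itself, but it becomes essential the moment the
            # walk must avoid sites it has already visited.
            return sum(extend(x + dx, y + dy, remaining - 1)
                       for dx, dy in steps)
        return extend(0, 0, n)

    for n in range(1, 7):
        print(n, count_walks(n), 4 ** n)   # the two counts agree

Adding a check that skips sites already on the path would turn this same enumeration into a counter for self-avoiding walks, which is exactly where the counting stops being easy.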

An interesting question to ask about random walks is whether the walker ever returns to the starting point. Eighty years ago George Pólya showed that the answer depends on the dimensionality of the lattice. In one or two dimensions a random walker is certain to come back home if the walk continues long enough. But three or more dimensions offer enough room to get lost in, and a return cannot be guaranteed no matter how long the walk goes on.
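Pólya's theorem concerns walks of unbounded length, but a simulation hints at it. The sketch below (my own, and necessarily approximate: the walks are truncated, so it can only estimate the chance of returning within a fixed number of steps) compares two and three dimensions:

    import random

    def return_fraction(dim, max_steps, trials=10000):
        """Estimate the chance that a random walk on a dim-dimensional
        cubic lattice returns to the origin within max_steps steps."""
        # Unit steps along each coordinate axis, in both directions.
        steps = []
        for axis in range(dim):
            for sign in (1, -1):
                step = [0] * dim
                step[axis] = sign
                steps.append(tuple(step))
        origin = (0,) * dim
        returned = 0
        for _ in range(trials):
            pos = origin
            for _ in range(max_steps):
                dx = random.choice(steps)
                pos = tuple(p + d for p, d in zip(pos, dx))
                if pos == origin:
                    returned += 1
                    break
        return returned / trials

    for dim in (2, 3):
        print(dim, return_fraction(dim, 1000))

In two dimensions the fraction creeps toward 1 as max_steps grows, though very slowly; in three dimensions it levels off at roughly one chance in three (the known value is about 0.34) no matter how long the walks are allowed to run.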

Pólya's result immediately tells us something about self-avoiding walks on a two-dimensional lattice. If a random walk's probability of returning home is 1, then the probability of never revisiting the origin must be 0. And since the origin is one of the places a self-avoiding walk must avoid, long self-avoiding walks are vanishingly rare among random walks: if you generate walks at random, your chance of stumbling on a self-avoiding one dwindles toward zero as the length grows. This scarcity is one reason self-avoiding walks are so hard to study. And yet, paradoxically, another reason is that they are so numerous it's a challenge to count them all.
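The scarcity is easy to observe numerically. One more sketch of my own: generate many random walks of length n and record what fraction happen never to revisit a site.

    import random

    def self_avoiding_fraction(n, trials=100000):
        """Estimate the probability that an n-step random walk
        on the square lattice never revisits a site."""
        steps = [(0, 1), (1, 0), (0, -1), (-1, 0)]
        survivors = 0
        for _ in range(trials):
            x, y = 0, 0
            visited = {(x, y)}
            for _ in range(n):
                dx, dy = random.choice(steps)
                x, y = x + dx, y + dy
                if (x, y) in visited:   # the walk has collided with itself
                    break
                visited.add((x, y))
            else:
                survivors += 1          # completed all n steps untouched
        return survivors / trials

    for n in (5, 10, 20, 40):
        print(n, self_avoiding_fraction(n))

The fraction collapses as n grows; by n = 40 you will quite likely see no survivors at all in 100,000 attempts, which is why simple rejection sampling is hopeless as a way of producing long self-avoiding walks.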



