
COMPUTING SCIENCE

Flights of Fancy

How birds (and bird-watchers) compute the behavior of a flock on the wing

Brian Hayes

Birds and Blobs

Making the photos was only the first challenge; making sense of them was harder. Even with high-resolution images, a starling 100 meters from the camera is little more than a dark blob, with no individually distinguishable features. Given a pair of images that show the same configuration of birds from different points of view, how do you match up blobs that correspond to the same bird? Then how do you solve the similar problem of tracking a bird through successive frames? A further complication is that blobs often overlap.

To solve these problems the group built a hierarchy of image-analysis tools. The first step removes background features such as clouds, isolating the dark blobs that represent birds. Then a “blob-splitting” process breaks up overlapping bird images. In this way the photograph is reduced to an array of point coordinates in the two-dimensional plane of the image.
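A minimal sketch of these first stages might look like the following (this is an illustration under simple assumptions, not the STARFLAG code): threshold the image to isolate dark blobs, label connected components, and reduce each blob to a 2D centroid, with a crude stand-in for blob splitting.

```python
import numpy as np
from scipy import ndimage

def blobs_to_points(image, threshold=0.3):
    """image: 2D array of gray values in [0, 1]; birds appear as dark blobs on a lighter sky."""
    mask = image < threshold                      # dark pixels -> candidate bird blobs
    labels, n = ndimage.label(mask)               # connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))

    points = []
    median_size = np.median(sizes)
    for (y, x), size in zip(centroids, sizes):
        if size > 2 * median_size:
            # crude "blob splitting": an oversized blob probably holds two birds;
            # the real pipeline resolves overlaps far more carefully
            points.extend([(y, x), (y, x)])
        else:
            points.append((y, x))
    return np.array(points)                       # (N, 2) array of image coordinates
```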

The leap into the third dimension requires combining information from multiple cameras. The basic geometric principle is well-known: An object photographed from two different points of view appears at different positions in the two images. This stereoscopic disparity provides information about the object’s range, or distance from the cameras. The complication is that the photographs include a few thousand objects that all look alike. Before calculating ranges, it’s necessary to figure out which blobs in image A go with which blobs in image B.
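For a single matched blob, the range calculation itself is elementary. The sketch below assumes an idealized parallel-camera geometry with a known baseline and focal length; the numbers in the example are illustrative only, not the STARFLAG camera setup.

```python
def stereo_range(x_a, x_b, baseline_m, focal_px):
    """Distance to an object seen at horizontal pixel positions x_a and x_b
    in cameras A and B separated by baseline_m meters (focal length in pixels)."""
    disparity = x_a - x_b                # shift between the two images, in pixels
    if disparity == 0:
        return float("inf")              # no shift: object effectively at infinity
    return focal_px * baseline_m / disparity

# Illustrative numbers: a 10 m baseline, a 4000-pixel focal length, and a
# 400-pixel disparity place the object 100 m from the cameras.
print(stereo_range(1600.0, 1200.0, 10.0, 4000.0))   # -> 100.0
```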

The STARFLAG group developed a multistage process for solving this matching problem. First a pattern-recognition algorithm searches for “constellations” of points in image A whose arrangement is distinctive enough that the same points can also be identified in B, even though the constellation will be distorted somewhat by the shift in point of view. Finding about 50 such matched pairs provides enough information to approximate the geometric transformation that maps any point in A to the corresponding pixel position in B. The approximation is refined by exploiting images from a third camera, C, mounted close to A. The short baseline and small optical disparity make the matching problem easier for A–C image pairs. And once the A–C matches are found, they can guide the search for more A–B matches. On average, the algorithm assigned 3D positions to 88 percent of the birds in the images analyzed.
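A toy version of constellation-style matching (an assumed simplification, not the published algorithm) describes each point by the sorted distances to its nearest neighbors and pairs points across images whose descriptors are most similar; distinctive local arrangements survive a modest distortion from the change in viewpoint.

```python
import numpy as np
from scipy.spatial import cKDTree

def constellation_descriptors(points, k=5):
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)    # nearest neighbor of each point is itself
    return dists[:, 1:]                        # (N, k) sorted neighbor distances

def match_constellations(points_a, points_b, k=5, max_cost=2.0):
    desc_a = constellation_descriptors(points_a, k)
    desc_b = constellation_descriptors(points_b, k)
    matches = []
    for i, da in enumerate(desc_a):
        costs = np.linalg.norm(desc_b - da, axis=1)   # compare local "constellations"
        j = int(np.argmin(costs))
        if costs[j] < max_cost:                        # keep only sufficiently distinctive matches
            matches.append((i, j))
    return matches                                     # list of (index in A, index in B)
```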

Tracking a bird through time, and thereby learning its velocity as well as its position, entails matching across many successive frames rather than just the two images of a stereo pair. This is a challenging task, but the precise coordination of starling flocks helps to make it feasible. Because at any instant most of the birds are flying at the same speed and in the same direction, the average velocity vector gives a very good clue about where to look for a bird in the next frame.
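A minimal sketch of velocity-guided linking (again an assumption about the general idea, not the group's tracking code): predict each bird's next position by adding the flock's mean displacement, then link each prediction to the nearest detection in the new frame.

```python
import numpy as np
from scipy.spatial import cKDTree

def link_frames(prev_positions, mean_velocity, next_positions, dt=1.0, max_jump=1.0):
    """prev_positions, next_positions: (N, 3) and (M, 3) arrays of 3D coordinates.
    mean_velocity: (3,) average flock velocity over the previous time step."""
    predicted = prev_positions + mean_velocity * dt     # where each bird is expected to be
    tree = cKDTree(next_positions)
    dists, idx = tree.query(predicted)                  # nearest detection to each prediction
    # keep only links that stay within a plausible distance of the prediction
    return [(i, int(j)) for i, (d, j) in enumerate(zip(dists, idx)) if d < max_jump]
```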



