FEATURE ARTICLE

# Knowing When to Stop

How to gamble if you must—the mathematics of optimal stopping

# The Marriage Problem

Suppose you decide to marry, and to select your life partner you will interview at most 100 candidate spouses. The interviews are arranged in random order, and you have no information about candidates you haven’t yet spoken to. After each interview you must either marry that person or forever lose the chance to do so. If you have not married after interviewing candidate 99, you must marry candidate 100. Your objective, of course, is to marry the absolute best candidate of the lot. But how?

This problem has a long and rich history in the mathematics literature, where it is known variously as the marriage, secretary, dowry or best-choice problem. Certainly you can select the very best spouse with probability at least 1/100, simply by marrying the first person. But can you do better? In fact, there is a simple rule that will guarantee you will marry the absolute best more than one-third of the time. And the rule can be transferred to other scenarios.

As an enlisted man in the U.S. Air Force during the Vietnam era, John Elton, now a Georgia Institute of Technology mathematician, transformed the marriage problem into a barracks moneymaking scheme. Elton asked his fellow airmen to write down 100 different numbers, positive or negative, as large or small as they wished, on 100 slips of paper, turn them face down on a table and mix them up. He would bet them he could turn the slips of paper over one at a time and stop with the highest number. He convinced them it was “obvious” that his chance of winning was very small, so he asked for ten dollars if he won and paid them one dollar if he lost. There was no shortage of bettors. Even though Elton lost nearly two-thirds of the games, he won more than one-third of them. And with the 10-1 odds, he raked in a bundle. How?
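The arithmetic behind the bundle is simple expected value. A rough sketch, assuming a win rate of about 37 percent (the rate the optimal rule guarantees, as explained below):

```python
# Elton collects $10 when he wins and pays $1 when he loses.
win_rate = 0.37  # assumed: roughly the rate the optimal stopping rule delivers
payout_win, payout_loss = 10.0, -1.0

expected_profit = win_rate * payout_win + (1 - win_rate) * payout_loss
print(round(expected_profit, 2))  # about $3.07 per game in Elton's favor
```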

First, note that there is a very simple strategy for winning more than one-fourth of the time, which would already put him ahead. Call an observed number a “record” if it is the highest number seen so far. Suppose you turn over half the numbers—or interview the first 50 marriage candidates—and never stop, no matter how high the number. After that you stop with the first record you see. If the second-highest number in the 100 cards happens to be in the first 50 you look at, and the highest in the second half—which happens 1 in 4 times—then you win. That strategy is good, but there is an even better one. Observe only 37 cards (or potential partners) without stopping and then stop with the next record. John Gilbert and Frederick Mosteller of Harvard University proved that this strategy is best and guarantees stopping with the best number about 37 percent of the time. In fact, observing
*N*/*e* ≅ 0.37*N* of the candidates, where *N* is the total number of candidates and *e* is the base of the natural logarithm, *e* = 2.71828…, guarantees winning with probability more than 1/*e* > 0.36, no matter how many cards or candidates there are. (Note that the “observe half the numbers” strategy clearly wins with probability at least 1/4, also independent of the number of cards.)
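Both cutoff rules are easy to check by simulation. The sketch below (a minimal illustration, not Elton's actual procedure) plays the card game many times with each rule. Note that the 1/4 figure for the observe-half rule is only a lower bound, counting just the case where the second-best card lands in the first half and the best in the second; the simulated win rate comes out somewhat higher.

```python
import random

def win_rate(n_cards, cutoff, trials, rng):
    """Play the stop-with-the-highest game `trials` times using the rule:
    pass over the first `cutoff` cards, then stop at the first record."""
    count = 0
    for _ in range(trials):
        cards = list(range(n_cards))   # ranks 0..n-1; n-1 is the best card
        rng.shuffle(cards)
        best_seen = max(cards[:cutoff])
        # Stop at the first card beating everything observed so far;
        # if no record ever appears, we are forced to take the last card.
        chosen = cards[-1]
        for card in cards[cutoff:]:
            if card > best_seen:
                chosen = card
                break
        count += (chosen == n_cards - 1)
    return count / trials

rng = random.Random(1)
p_half = win_rate(100, 50, 200_000, rng)  # observe half, then stop at a record
p_37 = win_rate(100, 37, 200_000, rng)    # observe N/e ~ 37, then stop
print(p_half, p_37)  # roughly 0.35 and 0.37
```

The 37-card cutoff beats the 50-card cutoff, and both comfortably exceed the 1/4 lower bound, in line with the Gilbert–Mosteller result.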

Sometimes the goal is to stop with one of the best *k* of *N* candidates. That is, you win if you stop with any one of the highest *k* numbers. In the Olympics or in horse racing, for example, the objective often is the *k* = 3 case (to win a medal or to show) rather than the all-or-nothing *k* = 1 goal of a gold medal or a win, which is much riskier. The optimal strategy for stopping with one of the best *k* is similar to stopping with the best. First, you should observe a fixed number of candidates without ever stopping, thereby obtaining a baseline to work with. Then, for another fixed stretch, stop if you see a record. Since a record is the best seen so far, it is somewhat likely to be one of the best *k*. If no record appears during that stretch, continue to the next stage, where you stop with one of the highest two numbers for a fixed period of time, and so on. For *k* = 2, this method guarantees a better than 57 percent chance of stopping with one of the two best, even if there are a million cards. For small *N*, the probability is quite high. Figure 3 illustrates the optimal strategy for *N* = 7 and *k* = 2, which guarantees a win two-thirds of the time.
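The two-thirds figure for *N* = 7 and *k* = 2 can be checked with a short dynamic program (a sketch of backward induction, not taken from the article). Working backward from the last candidate: a candidate at position *i* with relative rank *r* among those seen so far ends up among the overall top *k* with a probability given by a hypergeometric count of how the unseen numbers can fall, and you stop whenever that probability beats the value of continuing.

```python
from math import comb

def best_k_win_probability(n, k):
    """Optimal probability of stopping with one of the k best of n,
    computed by backward induction (you must stop by the last candidate)."""
    def stop_value(i, r):
        # P(candidate with relative rank r among the first i
        #   has final rank s <= k among all n)
        if r > k:
            return 0.0
        return sum(comb(s - 1, r - 1) * comb(n - s, i - r)
                   for s in range(r, k + 1)) / comb(n, i)

    value = 0.0  # value of continuing past position n: you can no longer win
    for i in range(n, 0, -1):
        # In a random order, the relative rank r is uniform on 1..i.
        value = sum(max(stop_value(i, r), value) for r in range(1, i + 1)) / i
    return value

print(best_k_win_probability(7, 2))  # 2/3, the N = 7, k = 2 case in the text
```

The same routine reproduces the classical *k* = 1 answers as well, such as the win probability of exactly 1/2 for *N* = 3 and roughly 0.371 for *N* = 100.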
