

Automation on the Job

Computers were supposed to be labor-saving devices. How come we're still working so hard?

Brian Hayes

The Full-Employment Paradox

[Figure: The workweek in the U.S.]

Enabling people to place their own phone calls and make their own travel reservations has put whole categories of jobs on the brink of extinction. U.S. telephone companies once employed more than 250,000 telephone operators; the number remaining is a tenth of that, and falling fast. It’s the same story for gas-station attendants, elevator operators and dozens of other occupations. And yet we have not seen the great contraction of the workforce that seemed inevitable 50 years ago.

One oft-heard explanation holds that automation brings a net increase in employment by creating jobs for people who design, build and maintain machines. A strong version of this thesis is scarcely plausible. It implies that the total labor requirement per unit of output is higher in the automated process than in the manual one; if that were the case, it would be hard to see the economic incentive for adopting automation. A weaker but likelier version concedes that labor per unit of output declines under automation, but total output increases enough to compensate. Even for this weaker prediction, however, there is no guarantee of such a rosy outcome. The relation may well be supported by historical evidence, but it has no theoretical underpinning in economic principles.
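The arithmetic behind the two versions of the thesis can be made concrete. The sketch below uses entirely hypothetical numbers; the point is only the relation total labor = labor per unit × output, which is what distinguishes the strong claim from the weak one.

```python
# Hedged sketch of the "automation creates jobs" arithmetic.
# All figures are made up for illustration.

def total_labor(labor_per_unit, output):
    """Total labor demanded = labor per unit of output x units produced."""
    return labor_per_unit * output

# Manual process: 10 worker-hours per unit, 1,000 units demanded.
manual = total_labor(10.0, 1_000)        # 10,000 worker-hours

# Automation cuts labor per unit to 4 hours. Whether total employment
# holds up depends entirely on how much output expands in response.
compensating = total_labor(4.0, 2_500)   # 10,000 -- output grew enough
shortfall = total_labor(4.0, 1_500)      # 6,000  -- it did not

print(manual, compensating, shortfall)
```

The strong version of the thesis would require labor per unit to *rise* under automation, which removes the incentive to automate; the weak version merely hopes demand expands enough, as in the middle case, and nothing in the arithmetic guarantees it.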

For a theoretical analysis we can turn to Herbert A. Simon, who was both an economist and a computer scientist and would thus seem to be the ideal analyst. In a 1965 essay, Simon noted that economies seek equilibrium, and so “both men and machines can be fully employed regardless of their relative productivity.” It’s just a matter of adjusting the worker’s wage until it balances the cost of machinery. Of course there’s no guarantee that the equilibrium wage will be above the subsistence level. But Simon then offered a more complex argument showing that any increase in productivity, whatever the underlying cause, should increase wages as well as the return on capital investment. Do these two results add up to perpetual full employment at a living wage in an automated world? I don’t believe they offer any such guarantee, but perhaps the calculations are reassuring nonetheless.

Another kind of economic equilibrium also offers a measure of cheer. The premise is that whatever you earn, you eventually spend. (Or else your heirs spend it for you.) If technological progress makes some commodity cheaper, then the money that used to go to that product will have to be spent on something else. The flow of funds toward the alternative sectors will drive up prices there and create new economic opportunities. This mode of reasoning offers an answer to questions such as, “Why has health care become so expensive in recent years?” The answer is: Because everything else has gotten so cheap.

I can’t say that any of these formulations puts my mind at ease. On the other hand, I do have faith in the resilience of people and societies. The demographic history of agriculture offers a precedent that is both sobering and reassuring. It’s not too much of an exaggeration to say that before 1800 everyone in North America was a farmer, and now no one is. In other words, productivity gains in agriculture put an entire population out of work. This was a wrenching experience for those forced to leave the farm, but the fact remains that they survived and found other ways of life. The occupational shifts caused by computers and automation cannot possibly match the magnitude of that great upheaval.


