
FEATURE ARTICLE

The Experimental Analysis of Behavior

The 1957 American Scientist article, reproduced in full

B. F. Skinner

Human Behavior

What about man? Is rate of responding still an orderly and meaningful datum here, or is human behavior the exception in which spontaneity and caprice still reign? In watching experiments of the sort described above, most people feel that they could “figure out” a schedule of reinforcement and adjust to it more efficiently than the experimental organism. In saying this, they are probably overlooking the clocks and calendars, the counters and the behavior of counting, with which man has solved the problem of intermittency in his environment. But if a pigeon is given a clock or a counter, it works more efficiently [19], and without these aids man shows little if any superiority.

Parallels have already been suggested between human and infrahuman behavior in noting the similarity of fixed-ratio schedules to piece-rate pay and of variable ratios to the schedules in gambling devices. These are more than mere analogies. Comparable effects of schedules of reinforcement in man and the other animals are gradually being established by direct experimentation. An example is some work by James Holland [20] at the Naval Research Laboratories on the behavior of observing. We often forget that looking at a visual pattern or listening to a sound is itself behavior, because we are likely to be impressed by the more important behavior which the pattern or sound controls. But any act which brings an organism into contact with a discriminative stimulus, or clarifies or intensifies its effect, is reinforced by this result and must be explained in such terms. Unfortunately mere “attending” (as in reading a book or listening to a concert) has dimensions which are difficult to study. But behavior with comparable effects is sometimes accessible, such as turning the eyes toward a page, tilting a page to bring it into better light, or turning up the volume of a phonograph. Moreover, under experimental conditions, a specific response can be reinforced by the production or clarification of a stimulus which controls other behavior. The matter is of considerable practical importance. How, for example, can a radar operator or other “lookout” be kept alert? The answer is: by reinforcing his looking behavior.
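The schedule parallels mentioned at the start of this comparison can be stated in a few lines. In the sketch below (the ratio sizes are arbitrary and illustrative), a fixed-ratio rule reinforces exactly every nth response, as piece-rate pay does, while a variable-ratio rule reinforces after an unpredictable number of responses with the same mean, as a gambling device does.

```python
import random

def fixed_ratio(n):
    """Reinforce every n-th response, as piece-rate pay does."""
    count = 0
    while True:
        count += 1
        yield count % n == 0          # True: this response is reinforced

def variable_ratio(mean):
    """Reinforce after an unpredictable number of responses averaging
    `mean`, as the payoff rule of a gambling device does."""
    required = random.randint(1, 2 * mean - 1)
    count = 0
    while True:
        count += 1
        if count >= required:
            yield True
            count = 0
            required = random.randint(1, 2 * mean - 1)
        else:
            yield False

# Reinforcements earned by 1,000 responses under each rule.
fr = fixed_ratio(20)
vr = variable_ratio(20)
print(sum(next(fr) for _ in range(1000)))   # exactly 50, evenly spaced
print(sum(next(vr) for _ in range(1000)))   # about 50, unpredictably spaced
```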

Holland has studied such reinforcement in the following way. His human subject is seated in a small room before a dial. The pointer on the dial occasionally deviates from zero, and the subject’s task is to restore it by pressing a button. The room is dark, and the subject can see the dial only by pressing another button which flashes a light for a fraction of a second. Pressing the second button is, then, an act which presents to the subject a stimulus which is important because it controls the behavior of restoring the pointer to zero.
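A minimal sketch of that contingency, assuming arbitrary probabilities for the pointer’s drift and for the subject’s button pressing, makes the two response classes and their consequences explicit; nothing below reproduces Holland’s actual apparatus or parameters.

```python
import random

DEFLECTION_PROB = 0.01   # chance per tick that the pointer drifts off zero (assumed)
OBSERVING_PROB = 0.05    # chance per tick that the subject flashes the light (assumed)

pointer_at_zero = True
reinforced_flashes = 0   # flashes that reveal a deflection
wasted_flashes = 0       # flashes that show the pointer still at zero

for tick in range(10_000):
    # Environment: the pointer occasionally deviates from zero.
    if pointer_at_zero and random.random() < DEFLECTION_PROB:
        pointer_at_zero = False

    # Observing response: pressing the button that flashes the light.
    if random.random() < OBSERVING_PROB:
        if pointer_at_zero:
            wasted_flashes += 1
        else:
            # The flash reveals the deflection, the stimulus that controls
            # the second response of pressing the reset button.
            reinforced_flashes += 1
            pointer_at_zero = True   # the pointer is restored to zero

print(f"flashes revealing a deflection: {reinforced_flashes}")
print(f"flashes showing the pointer at zero: {wasted_flashes}")
```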

Holland has only to schedule the deviations of the pointer to produce changes in the rate of flashing the light comparable to the performances of lower organisms under comparable schedules. In Figure 18, for example, the upper curve shows a pigeon’s performance on a fairly short fixed-interval. Each interval shows a rather irregular curvature as the rate passes from a low value after reinforcement to a high, fairly constant, terminal rate. In the lower part of the figure is one of Holland’s curves obtained when the pointer deflected from zero every three minutes. After a few hours of exposure to these conditions, the subject flashed the light (“looked at the pointer”) only infrequently just after a deflection, but as the interval passed, his rate accelerated, sometimes smoothly, sometimes abruptly, to a fairly constant terminal rate. (An interesting feature of this curve is the tendency to “run through” the reinforcement and to continue at a high rate for a few seconds after reinforcement before dropping to the low rate from which the terminal rate then emerges. Examples of this are seen at a, b, and c. Examples in the case of the pigeon are also seen at d and e. In their study of schedules, Ferster and the writer had investigated this effect in detail long before the human curves were obtained.)
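The fixed-interval pattern can be sketched numerically. Assuming the three-minute interval of the text and an arbitrary terminal rate, a response probability that grows from near zero after reinforcement toward a constant maximum yields the scalloped cumulative record in rough outline; the constants are illustrative, not taken from the data behind Figure 18.

```python
import random

INTERVAL = 180           # seconds between reinforcements (three minutes, as in the text)
TERMINAL_RATE = 1.0      # responses per second late in the interval (assumed)
SESSION = 4 * INTERVAL   # simulate four intervals

cumulative = 0
record = []              # (time, cumulative responses): a cumulative record

for t in range(SESSION):
    elapsed = t % INTERVAL
    # Response probability rises from ~0 just after reinforcement toward
    # the terminal rate late in the interval -- the "scallop."
    rate = TERMINAL_RATE * (elapsed / INTERVAL) ** 2
    if random.random() < rate:
        cumulative += 1
    record.append((t, cumulative))

# Responses accumulated in each quarter of the first interval show the acceleration.
for start in range(0, INTERVAL, INTERVAL // 4):
    quarter = [c for (tt, c) in record if start <= tt < start + INTERVAL // 4]
    print(f"t = {start:3d}-{start + INTERVAL // 4:3d} s: {quarter[-1] - quarter[0]} responses")
```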

Other experiments on human subjects have been conducted in the field of psychotic behavior. In a project at the Behavior Research Laboratories of the Metropolitan State Hospital, in Waltham, Massachusetts, a psychotic subject spends one or more hours each day in a small room containing a chair and an instrument panel as seen in Figure 19. At the right of the instrument board is a small compartment (a) into which reinforcers (candy, cigarettes, coins) are dropped by an appropriate magazine. The board contains a plunger (b), similar to that of a vending machine. The controlling equipment behind a series of such rooms is shown in Figure 20. Along the wall at the left, as at a, are seen four magazines, which can be loaded with various objects. Also seen are periscopes (as at b) through which the rooms can be observed through one-way lenses. At the right are cumulative recorders (as at c) and behind them panels bearing the controlling equipment which arranges schedules.

It has been found that even deteriorated psychotics of long standing can, through proper reinforcement, be induced to pull a plunger for a variety of reinforcers during substantial daily experimental sessions and for long periods of time. Schedules of reinforcement have the expected effects, but the fact that these organisms are sick is also apparent. In Figure 21, for example, the record at A shows a “normal” human performance on a variable-interval schedule where the subject (a hospital attendant) is reinforced with nickels on an average of once per minute. A straight line, similar to the records of the pigeon and chimpanzee in Figure 3, is obtained. Records B, C, and D are the performances of three psychotics on the same schedule working for the same reinforcers. Behavior is sustained during the session (as it is during many sessions for long periods of time), but there are marked deviations from straight lines. Periods of exceptionally rapid responding alternate with pauses or periods at a very low rate.
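A sketch of the variable-interval contingency, using the once-per-minute average of the text and an assumed steady response rate, shows why the cumulative record of the “normal” subject approximates a straight line: responding is roughly constant, and reinforcement simply becomes available at unpredictable moments.

```python
import random

MEAN_INTERVAL = 60    # reinforcement available on average once per minute, as in the text
RESPONSE_RATE = 0.5   # steady probability of a response per second (assumed)
SESSION = 30 * 60     # a thirty-minute session

next_setup = random.expovariate(1 / MEAN_INTERVAL)   # time at which the next reinforcer is armed
available = False
responses = 0
reinforcers = 0

for t in range(SESSION):
    # The schedule arms a reinforcer at unpredictable times averaging one per minute.
    if not available and t >= next_setup:
        available = True

    # A steady responder: a roughly constant rate, so the cumulative
    # record approximates the straight line of record A in Figure 21.
    if random.random() < RESPONSE_RATE:
        responses += 1
        if available:
            reinforcers += 1
            available = False
            next_setup = t + random.expovariate(1 / MEAN_INTERVAL)

print(f"{responses} responses, {reinforcers} reinforcers in {SESSION // 60} minutes")
```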

That a schedule is nevertheless effective in producing a characteristic performance is shown by Figure 22. A fixed-ratio performance given by a pigeon under conditions in which there is substantial pausing after reinforcement is shown at A. In spite of the pauses, the general rule holds: as soon as responding begins, the whole ratio is quickly run off. Fixed-ratio curves for two psychotic subjects, both severely ill, are shown at B and C. Only small ratios can be sustained (40 and 20, respectively), and pauses follow all reinforcements. Nevertheless, the performance is clearly the result of a ratio schedule: once responding begins, the complete ratio is run off.
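The fixed-ratio rule can be sketched with assumed figures: a pause of variable length after each reinforcement, followed by the full ratio emitted at a high, steady rate. The ratio of 40 matches one of the subjects described above; the pause and rate values are illustrative only.

```python
import random

RATIO = 40        # responses required per reinforcement (as for subject B in the text)
MEAN_PAUSE = 30   # mean pause after each reinforcement, in seconds (assumed)
RUN_RATE = 2.0    # responses per second once responding begins (assumed)

t = 0.0
total = 0
for i in range(5):
    # Pause after reinforcement: little or no responding for a while.
    t += random.expovariate(1 / MEAN_PAUSE)
    # Once responding begins, the whole ratio is run off quickly.
    t += RATIO / RUN_RATE
    total += RATIO
    print(f"reinforcement {i + 1}: {total} responses by t = {t:.0f} s")
```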



