SKINNER, B.F.


Science and Human Behavior

…..

The term "learning" may profitably be saved in its traditional sense to describe the reassortment of responses in a complex situation. Terms for the process of stamping in may be borrowed from Pavlov's analysis of the conditioned reflex. Pavlov himself called all events which strengthened behavior "reinforcement" and all the resulting changes "conditioning." In the Pavlovian experiment, however, a reinforcer is paired with a stimulus; whereas in operant behavior it is contingent upon a response. Operant reinforcement is therefore a separate process and requires a separate analysis. In both cases, the strengthening of behavior which results from reinforcement is appropriately called "conditioning." In operant conditioning we "strengthen" an operant in the sense of making g. response more probable or, in actual fact, more frequent. In Pavlovian or "respondent" conditioning we simply increase the magnitude of the response elicited by the conditioned stimulus and shorten the time which elapses between stimulus and response. (We note, incidentally, that these two cases exhaust the possibilities: an organism is conditioned when a reinforcer [1] accompanies another stimulus or [2] follows upon the organism's own behavior. Any event which does neither has no effect in changing a probability of response.) In the pigeon experiment, then, food is the reinforcer and presenting food when a response is emitted is the reinforcement. The operant is defined by the property upon which reinforcement is contingent— the height to which the head must be raised. The change in frequency with which the head is lifted to this height is the process of operant conditioning .

…..

It is important to distinguish between schedules which are arranged by a system outside the organism and those which are controlled by the behavior itself. An example of the first is a schedule of reinforcement which is determined by a clock—as when we reinforce a pigeon every five minutes, allowing all intervening responses to go unreinforced. An example of the second is a schedule in which a response is reinforced after a certain number of responses have been emitted—as when we reinforce every fiftieth response the pigeon makes. The cases are similar in the sense that we reinforce intermittently in both, but subtle differences in the contingencies lead to very different results, often of great practical significance.
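
The two kinds of intermittent schedule can be contrasted in a short sketch. The Python code below is only an illustration of the distinction, not an account of the original apparatus: the clock-based case reinforces the first response emitted after each five-minute interval has elapsed, while the behavior-based case reinforces every fiftieth response regardless of timing. The function names, the assumed response rate, and the session length are hypothetical values chosen for the example.

    import random

    def fixed_interval(duration_s=3600.0, interval_s=300.0, rate_per_s=0.5):
        """Clock-based schedule: only the first response emitted after each
        interval_s seconds has elapsed is reinforced; all intervening
        responses go unreinforced."""
        reinforcers, last_reinforcement, t = 0, 0.0, 0.0
        while True:
            t += random.expovariate(rate_per_s)   # time of the next response
            if t >= duration_s:
                break
            if t - last_reinforcement >= interval_s:
                reinforcers += 1
                last_reinforcement = t
        return reinforcers

    def fixed_ratio(n_responses=10_000, ratio=50):
        """Behavior-based schedule: every ratio-th response is reinforced,
        however quickly or slowly the responses are emitted."""
        return n_responses // ratio

    random.seed(0)
    print("clock-based (every 5 min):", fixed_interval(), "reinforcers in one hour")
    print("count-based (every 50th): ", fixed_ratio(), "reinforcers in 10,000 responses")

In the clock-based case the number of reinforcements is fixed by elapsed time no matter how fast the organism responds; in the count-based case it is fixed by the amount of behavior emitted. That is the subtle difference in contingency which leads to very different performances in practice.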

…..