The two major types of reinforcement schedules are continuous and intermittent. A continuous reinforcement schedule reinforces the desired behavior every time it is demonstrated. Take, for example, the case of someone who has historically had trouble arriving at work on time. Every time he is not tardy, his manager might compliment him on his desirable behavior. In an intermittent schedule, on the other hand, not every instance of the desirable behavior is reinforced, but reinforcement is given often enough to make the behavior worth repeating. This latter schedule can be compared to the workings of a slot machine, which people will continue to play even when they know it is adjusted to give a considerable return to the casino. The intermittent payoffs occur just often enough to reinforce the behavior of slipping in coins and pulling the handle. Evidence indicates that the intermittent, or varied, form of reinforcement tends to promote more resistance to extinction than does the continuous form.
An intermittent reinforcement schedule can be of a ratio or interval type. Ratio schedules depend on how many responses the subject makes: the individual is reinforced after exhibiting a certain number of specific behaviors. Interval schedules depend on how much time has passed since the previous reinforcement: the individual is reinforced on the first appropriate behavior after a particular amount of time has elapsed. Reinforcement can also be classified as fixed or variable.
When rewards are spaced at uniform time intervals, the reinforcement schedule is of the fixed-interval type. The critical variable is time, and it is held constant. This is the predominant schedule for most salaried workers in North America. When you get your paycheck on a weekly, semi-monthly, monthly, or other predetermined time basis, you're rewarded on a fixed-interval reinforcement schedule.
If rewards are distributed in time so that reinforcements are unpredictable, the schedule is of the variable-interval type. When an instructor advises her class that pop quizzes will be given during the term (the exact number of which is unknown to the students) and the quizzes will account for 20% of the term grade, she is using a variable-interval schedule. Similarly, a series of randomly timed unannounced visits to a company office by the corporate audit staff is an example of a variable-interval schedule.
In a fixed-ratio schedule, a reward is given after a fixed or constant number of responses. For example, a piece-rate incentive plan is a fixed-ratio schedule; the employee receives a reward based on the number of work pieces generated. If the piece rate for a zipper installer in a dressmaking factory is Rs. 120 a dozen, the reinforcement (money in this case) is tied to the number of zippers sewn into garments. After every dozen is sewn in, the installer has earned another Rs. 120.
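The piece-rate arithmetic above can be sketched in a few lines of Python. This is only an illustration of the fixed-ratio idea; the function name is hypothetical, and the Rs. 120-per-dozen rate is the figure from the example.

```python
RATE_PER_DOZEN = 120  # rupees earned for every 12 zippers installed (from the example)

def piece_rate_pay(zippers_installed: int) -> int:
    """Fixed-ratio reward: pay depends only on the count of responses.

    Every full dozen earns Rs. 120; a partial dozen earns nothing yet.
    """
    return (zippers_installed // 12) * RATE_PER_DOZEN

print(piece_rate_pay(36))  # three dozen sewn in -> Rs. 360
print(piece_rate_pay(40))  # still only three full dozen -> Rs. 360
```

Note that the reward is a function of the behavior count alone; time plays no role, which is what distinguishes a ratio schedule from an interval schedule.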
When the reward varies relative to the behavior of the individual, he or she is said to be reinforced on a variable-ratio schedule. Salespeople on commission are examples of individuals on such a reinforcement schedule. On some occasions, they may make a sale after only two calls on a potential customer. On other occasions, they might need to make 20 or more calls to secure a sale. The reward, then, is variable in relation to the number of calls the salesperson must make to secure a sale.