Key Terms | Reinforcement Schedules | Types of Reinforcers | Multiple Choice |
---|---|---|---|
**Discrimination**: learning that some responses, but not others, will be reinforced | **Variable-Ratio Schedule**: in operant conditioning, a reinforcement schedule that reinforces a response after an unpredictable number of responses | **Positive Reinforcer**: any stimulus that, when presented after a response, strengthens the response; increases behaviors | The purpose of reinforcement is to a. cause a behavior to stop. b. cause a behavior to diminish. c. cause a behavior to continue. d. strengthen the spontaneous recovery process. e. cause a behavior to occur for only a limited amount of time. **Answer:** c. cause a behavior to continue. |
**Generalization**: responses learned in one situation occurring in other, similar situations | **Fixed-Interval Schedule**: in operant conditioning, a reinforcement schedule that reinforces a response only after a specified time has elapsed | **Negative Reinforcer**: any stimulus that, when removed after a response, strengthens the response; removes or reduces aversive stimuli | Which of the following best describes negative reinforcement? a. John stops shooting bad free throws because his coach benches him when he does. b. Brian studies hard because it earns him “A” grades in math. c. Lillian used to walk to school but does not do so anymore because she was attacked by a dog last month. d. Charles smokes because his anxiety is reduced when he does so. e. Osel wears his seat belt because his driving teacher cited accident statistics in class. **Answer:** d. Charles smokes because his anxiety is reduced when he does so. |
**Spontaneous Recovery**: the reappearance, after a rest period, of an extinguished response | **Variable-Interval Schedule**: in operant conditioning, a reinforcement schedule that reinforces a response at unpredictable time intervals | **Primary Reinforcer**: an innately reinforcing stimulus, such as one that satisfies a biological need | Thorndike’s principle that behaviors followed by favorable consequences become more likely to be repeated is known as what? a. Law of effect b. Operant conditioning c. Shaping d. Respondent behavior e. Discrimination **Answer:** a. Law of effect |
**Extinction**: responding decreases when reinforcement stops | **Fixed-Ratio Schedule**: in operant conditioning, a reinforcement schedule that reinforces a response only after a specified number of responses | **Conditioned Reinforcer**: a stimulus that gains its reinforcing power through its association with a primary reinforcer; also known as a secondary reinforcer | All of the following are examples of primary reinforcers except: a. a rat’s food reward in a Skinner box. b. a cold drink on a hot day. c. a high score on an exam for which a student studied diligently. d. a hug from a loved one. e. a large meal following an extended time without food. **Answer:** c. a high score on an exam for which a student studied diligently. |
**Acquisition**: associating a response with a consequence (reinforcer or punisher) | **Partial Reinforcement Schedule**: reinforcing a response only part of the time; results in slower acquisition of a response but much greater resistance to extinction than does continuous reinforcement | **Delayed Reinforcer**: reinforcement that does not occur immediately after a response has been made | Shea bought 10 tickets for the raffle for free homecoming entry, but she did not win. Months later, she buys 10 tickets for the senior prom raffle, hoping this will be the time she wins. Which schedule of reinforcement best explains this scenario? a. Fixed-ratio b. Variable-ratio c. Fixed-interval d. Variable-interval e. Continuous **Answer:** b. Variable-ratio |