University of Idaho Psychology of Learning
Lesson 4.1: Lecture 1 Transcript
 

 

Department of Psychology
© University of Idaho. All rights reserved.
Transcript of Audio Lecture
In the last sections we’ve been examining classical and instrumental conditioning. In this section we begin a discussion of operant conditioning and the variables that impact it. So, let’s begin by discussing some differences between operant and classical conditioning, and between operant and instrumental conditioning.

As we see in slide two, in classical conditioning the focus was on the relationship between two stimuli, the conditioned stimulus and the unconditioned stimulus. In instrumental conditioning, the focus shifted to how the stimulus affected some particular type of response. Now, both of these are different from operant conditioning. In operant conditioning, it’s not the stimulus before the response that’s important; what follows the response is most important. That is, the consequent stimulus. So in this case, we have a response followed by some consequent stimulus, which we abbreviate as SC. Thus, if you put it all together, we have some stimulus that causes a particular response, which in turn is followed by some consequent stimulus. So, in some ways, you have an S-R-SC type of model.

Let’s discuss some differences between instrumental and operant conditioning. These differences really go back to some very basic things that we discussed in instrumental conditioning, primarily with Thorndike and others, and the discussion begins in slide three.

In instrumental conditioning, the environment constrains the opportunity for a particular reward. That is, a specific behavior is required for a reward. In operant conditioning, however, a specific response is required for reinforcement. That is, the reward follows the particular response, and the frequency of responding determines the amount of reinforcement that’s given. In many ways, this relates to the type of methodology that we use to examine instrumental versus operant conditioning. For example, in instrumental conditioning, what we typically used was the T-maze. You would have a T-shaped maze with food or something at one end. You would then take the rat and put it at the top of the T. The rat would run down the maze, come to a junction, and then turn right or left to get to where the food was.

In operant conditioning, the system was a little different in that the organism was placed in what we call an operant chamber (Skinner box), which we’ll describe in more detail later. In essence, the organism is inside the box pressing some particular key or bar. When it presses, it gets some kind of food. As a result, there is no running the maze and no having to pick the animal up at the end of the T and put it back. Instead, the animal stays within the operant chamber and makes particular sets of responses.

Now, let’s talk about the person who was really important within operant conditioning. While some individuals classify Thorndike as an operant conditioning psychologist, and he had a major impact on it, Thorndike was not really the person who developed operant conditioning. The person who really made the impact in what we call operant conditioning was B. F. Skinner. From his research and writings, B. F. Skinner developed a form of behaviorism called Radical Behaviorism. It was called radical not because it was so extreme, but in the sense of its importance and the kinds of variables it examined.

Now, a little history. Skinner is probably the most important psychologist in the applied area that’s ever lived. In fact, his principles of operant conditioning and others have basically been used in everything. They’re used in medicine, they’re used in education, they’re used in therapy, they’re used in business. Ultimately, his underlying principles will have major, major impacts in all of psychology and in many, many other fields as well.

Skinner distinguished between two general types of responses, which are shown in slide five. He described them as respondents and operants. So, let’s talk about each of these for a second and examine respondents first.

Respondents are elicited by some kind of unconditioned stimulus. Respondents in essence are basically responses that are innate. They’re regulated by the autonomic nervous system, for example, your heart rate, your blood pressure, drops of saliva, etc. They’re also involuntary, that is, you have no control over them. And finally they’re classically conditioned.

So what’s the difference between respondents and operants? Well, for Skinner, as we see in slide seven, operants are emitted, they are in essence skeletal in nature, and they are voluntary. As a result, you get lots and lots of feedback.

Now, Skinner in his studies systematically demonstrated several things, which we’ll talk about in considerable detail over the next few sections. The first thing he demonstrated was this: if something occurs after a response, that is, some kind of consequent stimulus is given, and the behavior increases, the procedure is called reinforcement, and the thing that actually causes the particular increase is called the reinforcer. So, if the procedure that you’re doing (whatever that procedure may be) causes the behavior to go up, the procedure is in essence a reinforcement procedure. The things that you give (the chocolate chip cookies, the money, or whatever it may be) are called reinforcers.

In addition, there is a second thing that we have to discuss, which is shown on slide nine. This relates to decreases in behavior. That is, if something follows a response, that is, a consequent stimulus, and the behavior goes down, that procedure is called punishment, and the thing that causes the particular decrease is called a punisher. So, again, the organism makes some kind of response and the behavior goes down. The overall procedure, such as time out, spanking, or whatever, is called punishment, and the actual things that cause the decrease, the electric shock, the smack on the butt, or whatever, are called punishers.

So, says Skinner, as we see in slide 10, reinforcers always increase a behavior and punishers always decrease a behavior. There are no exceptions to the rule. So, if you have a behavior that’s going up, you are automatically using some kind of reinforcement procedure and giving some kind of reinforcer, even though it may seem intuitively the opposite.

Finally, Skinner defines two types of reinforcers and punishers, and these are shown in slide 11. Basically, the difference between the types is related to whether you add or remove something. What Skinner does is use a kind of electrical model; that’s what you need to think about to make sure you stay on track. For example, if you add something following a response, that is positive, and if you remove something following a response, that is negative. So the concepts of positive and negative, like the plus and minus of an electrical model, simply indicate whether you’re adding or removing something. Thus, as we see at the bottom of the slide, positive does not mean good and negative does not mean bad; all they indicate is that you are either adding or removing something.

Now, in the next sections we will go into much more detail on what we refer to as appetitive and aversive conditioning. Until then, we hope that you have a great day.
