All right, more factors: the schedule of reinforcement. Here we're going to compare a continuous schedule, a fixed ratio schedule, and a variable interval schedule. What I want you to notice is the difference in the size of the extinction burst and the rate at which the behavior goes away, where the rate is the steepness of that extinction line.

So the idea here: we've got a behavior on extinction. Probability of response is at one level at baseline, and then we put it on extinction. Notice that it bumps up a little bit, not much, and then drops off really quickly to near zero and stays there. That's what you see with a continuous reinforcement schedule, like a pop machine: every time you put in your money and press the button, you get a reinforcer. Press the button, get a reinforcer. Press the button, get a reinforcer. That's pretty standard.

But suppose we use something a little different, a fixed ratio schedule, where after a fixed number of responses, not an average number, you get a reinforcer. We could set up a pop machine on an FR-5, where you'd have to hit that button five times before it delivers a pop. You still only put in your $1.50, but you've got to press the button five times. Notice what that does to the extinction burst: it gets bigger. And extinction is delayed; it doesn't happen as fast. The line is close, but not quite as steep.

Now watch what happens with another type of intermittent schedule, the variable interval. Look at that extinction burst, the orange line: it's way up there, really high. It took a while to get up there, and look at how long it takes for that behavior to go away.
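To make the three schedule definitions concrete, here's a minimal Python sketch of how reinforcers get delivered under each one. The function names and the exponential spacing of the variable-interval "setups" are my own illustrative assumptions, not something from the lecture:

```python
import random

def crf(presses):
    """Continuous reinforcement: every press delivers a pop."""
    return presses  # one reinforcer per response

def fr5(presses):
    """Fixed ratio 5: a pop after every 5th press, no exceptions."""
    return presses // 5

def vi(press_times, mean_interval=30, seed=0):
    """Variable interval (illustrative): a reinforcer becomes available
    after a randomly timed 'setup', and the first press at or after that
    time collects it. Setups average mean_interval seconds apart."""
    rng = random.Random(seed)
    reinforcers = 0
    next_setup = rng.expovariate(1 / mean_interval)
    for t in sorted(press_times):
        if t >= next_setup:
            reinforcers += 1
            next_setup = t + rng.expovariate(1 / mean_interval)
    return reinforcers
```

For example, ten presses earn ten pops on CRF but only two on FR-5, while on a VI schedule the payoff depends on when the presses happen, not how many there are.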
A variable interval schedule means that, on average, a behavior will be reinforced after a certain amount of time has passed. If you see a variable interval or a variable ratio schedule, those behaviors are really resistant to extinction. A fixed ratio or a fixed interval schedule isn't quite as resistant to extinction as a variable one, but it's more resistant than a continuous schedule. On a continuous schedule, behavior is easy to extinguish.

Now you might ask why. If we recall back to 324, we didn't cover it there, but your book covered something called the PRE, the partial reinforcement extinction effect. In a nutshell, there are about five different hypotheses associated with the PRE, but the one I subscribe to, the one I think is most valid, says the person or animal is unable to realize they are on extinction, because they have been on an intermittent schedule of reinforcement in the past. If they've been on an intermittent schedule, their behavior isn't reinforced every time, so when you start extinction, they don't realize they're no longer going to be reinforced. Why? Because their behavior wasn't reinforced every single time to begin with. So it takes a while for an organism or person to detect that they're on extinction if their behavior has been on intermittent reinforcement in the past.

With continuous reinforcement, you know right away. You know the moment that soda machine isn't working, because it's not giving you your pop and you're pissed. So with continuous reinforcement there is no PRE, no partial reinforcement extinction effect, because the behavior wasn't partially reinforced; it was continuously reinforced. Again, the hypothesis I subscribe to is the one where the organism simply isn't able to detect that it's on extinction. That's where I land on it.
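The discrimination idea behind the PRE can be illustrated with a little probability sketch. Treat a schedule, very roughly, as reinforcing each response with some probability, and ask: how surprising is a run of unreinforced responses while the schedule is still active? This framing (and the 20% figure) is my own simplification for illustration, not the lecture's model:

```python
def prob_run_of_misses(p_reinforce, k):
    """Chance of k unreinforced responses in a row while the schedule
    is still active and reinforces each response with prob p_reinforce."""
    return (1 - p_reinforce) ** k

# Continuous reinforcement: even ONE miss is impossible unless
# extinction has started, so extinction is detected immediately.
crf_surprise = prob_run_of_misses(1.0, 1)   # 0.0

# Intermittent schedule reinforcing ~20% of responses: five misses
# in a row is still quite likely (0.8**5 ~= 0.33), so the organism
# can't tell extinction apart from business as usual for a long time.
vr_surprise = prob_run_of_misses(0.2, 5)
```

Under this toy model, the longer runs of non-reinforcement an organism has routinely experienced, the weaker the evidence any given run provides that extinction has begun, which is the intuition behind the discrimination account of the PRE.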