This video will talk about exponential modeling. Here's our first example. The number of deaths per 100,000 women in the United States is given in this table. Let d(t) be the number of deaths per 100,000 women t years since 1900. So 1930 becomes 30, because it's 30 years since 1900, and the years 1930 through 1990 become 30, 40, 50, 60, 70, 80, and 90. Those are going to be my L1 values, and the death values go into L2 just as they are.

So if I come into my calculator again — remember, I hit STAT and then Edit to get here — I put 30, 40, 50, 60, 70, 80, and 90 into L1. Then I arrow over to L2 and put in 28, 21, 13, 9, 6, 5, and 4.

Now I have to decide what kind of regression to use, because the problem asks me to use regression. If I look at the data, it's definitely decreasing, but not by a constant amount. In fact, I was having my son help me with this, and he said, "Mom, I know this is exponential, because you start with a big number, you end with a really little number, and there isn't a whole lot in between." That's one way to recognize the pattern. If I didn't know that, I could clear my equations out so they don't get in the way, turn my plot on, and remember, we can use Zoom 9 to look at the data. When we do, it definitely looks like a decreasing exponential curve.

So I'm going to use STAT, over to CALC, and then 0 gives me exponential regression — once you know this, you don't have to arrow down every time. Press Enter, and now I know that, rounded to two decimal places, d(t) = 74.21(0.97)^t. That is, my a is 74.21 and my b is 0.97, raised to the t.
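If you want to check the calculator's answer off-screen, here's a quick sketch in Python (NumPy assumed; this isn't part of the calculator workflow). A TI's ExpReg fits a line to the natural logs of the y-values, so a log-linear least-squares fit reproduces the same a and b:

```python
import numpy as np

# Years since 1900 (L1) and deaths per 100,000 women (L2), from the table.
t = np.array([30, 40, 50, 60, 70, 80, 90])
d = np.array([28, 21, 13, 9, 6, 5, 4])

# ExpReg fits a line to ln(d) versus t, then converts back:
# ln(d) = intercept + slope*t  means  d = e^intercept * (e^slope)^t.
slope, intercept = np.polyfit(t, np.log(d), 1)
a = np.exp(intercept)  # initial value
b = np.exp(slope)      # yearly decay factor

print(round(a, 2), round(b, 2))  # 74.21 0.97
```

The same two decimal places the calculator reports, which is a nice sanity check that ExpReg is just linear regression on the logs.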
So here I have my function written again, just to remind us. Estimate the number of stomach cancer deaths per 100,000 women in 2000. Well, if you go back and read the problem again, t is years since 1900, so in the year 2000 we know that t is 100. So we just plug and chug. In fact, if I put my equation in my calculator, 74.21 times 0.97 in parentheses, caret x, and then press 2nd WINDOW, I can tell my table to start at 100 so I can see the value when x is 100. The table says 3.5289, or approximately 3.53. Now remember the units: the data value of 4 back in 1990 meant 4 deaths per 100,000 women, so this estimate means about 3.53 deaths per 100,000 women in 2000.

Then it asks, what would be a reasonable domain and range? Well, we know it started in 1900 — that's when t is 0 — and we could go up to whenever we think the value might reach 0. If you look at this, it's going to take a long time to get to 0. In fact, if you remember exponential functions, they get really close to 0 but never get there, so we could justify almost anything here. But what happens 200 years from now? At t = 200, we're getting close to 0, so maybe we'll say a domain of 0 to 200. Some of you might say, no, let's just go to today, so 0 to 112. That's good too; I'm not picky about this, it just has to be reasonable. Infinity, we hope, isn't reasonable — we'd like to cure cancer so this doesn't go on forever and ever. For our range, we know it started at 28 deaths per 100,000, and we hope to get down toward 0. At t = 200, the model gives about 0.17.
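The plug-and-chug step is just evaluating the fitted model; a minimal sketch, using the rounded coefficients from the video:

```python
# The fitted model from the video, rounded to two decimal places:
# d(t) = 74.21(0.97)^t, t years since 1900, in deaths per 100,000 women.
def d(t):
    return 74.21 * 0.97 ** t

print(round(d(100), 2))  # 3.53  -> estimate for the year 2000
print(round(d(200), 2))  # 0.17  -> value at the far end of the domain
```

This is exactly what the calculator's table does when you set TblStart to 100.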
So if we really wanted these to be related, we'd say: this is where it began, and this is where it is 200 years later. The deaths would run from 28 down to about 0.17, and again, those are deaths per 100,000 women.

All right, another example. The intensity of light decreases as you go deeper in water, and the intensity at several depths is given in our table. We're going to use regression to find a model. So let's go back to STAT and Edit, and clear everything out so we can start fresh. We put in our L1 — it's going to be depth, so 0, 1, 2, 3, 4, 5 — and then our L2 values, the intensities, starting with 100%. And I missed something somewhere: my 3 and my 4 ran together as 34. I would have gotten an error here, because I had more things in L2 than in L1, so I always double-check that before I go on. I can see right here that I typed 34 instead of 3 and 4. So I'm just going to arrow over, back up, make this 3, then Enter, 4, then Enter, and then 5, and then I should be all lined up and ready to go.

If you look at this data, it starts with a big number and gets to a very little number quickly, so it should be exponential — and if we plot it, it definitely looks exponential. So STAT, over to CALC, 0 — remember, that was exponential regression. There it is; press Enter. And we have I(D) = 98.09(0.25)^D. Remember, these variables need to match the problem.

We want to estimate the intensity at 6 meters, so that's a D: we put in 98.09 times 0.25 to the 6th. If I put my equation in and use 2nd WINDOW again to tell the table I want x = 6, we find that it's about 0.02. And remember, this is the percent of light remaining at a depth of 6 meters. Reasonable domain and range?
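Evaluating the light model works the same way as before — a short sketch, using the rounded coefficients from the video:

```python
# The fitted model from the video: I(D) = 98.09(0.25)^D,
# D in meters, I as a percent of surface intensity.
def intensity(depth_m):
    return 98.09 * 0.25 ** depth_m

print(round(intensity(6), 2))  # 0.02 -> percent of surface light at 6 meters
```

Notice how fast a base of 0.25 shrinks: the light loses three quarters of its intensity with every meter of depth.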
I would say our table is already pretty close to a good domain; we know it should go a little past 5. If we look at the table, the values round pretty close to 0 quickly — by a depth of 9 the model gives about 3.7 times 10 to the negative fourth — and it keeps getting closer and closer, but again, we're never going to get to 0. So I'm going to say 0 to 10 meters would be a reasonable domain. For my range, that means we went from 100 and got pretty close to 0. Since I use a parenthesis at 0, it means we get close to 0 but never get there.

Next: the following table displays the number of Starbucks stores worldwide since 1991. That "since 1991" is going to be important. It means that in 1991, zero years have gone by; in 1993, two years have gone by; 1995 would be four; 1997 would be six; 1999 would be eight; 2001 would be ten; and 2003 would be twelve. Those are my L1 values, and the numbers of stores would be my L2 values. I've already put those in my calculator, so now I'm ready to go find the regression. I can see that it should be exponential because it goes from a small number to a very large number very quickly. So STAT, over to CALC, and 0 — exponential regression. And when I find it, I have 142.21 times 1.41 to the t.

So what is the percentage growth rate of Starbucks stores? We haven't talked about percentage growth rate, but since the base of 1.41 is bigger than 1, the growth rate is that base minus 1. If I take 1.41 and subtract 1, I end up with 0.41, which is the same thing as 41%. So that's my growth rate. If this had been a decay, it would have been switched around: with a base smaller than 1, I'd compute 1 minus b instead of b minus 1, so that I get a positive number.
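The growth-versus-decay rule above can be sketched in a few lines of Python (the function name is mine, not from the video):

```python
# Percentage rate from the base b of an exponential model a*b**t:
# growth rate is b - 1 when b > 1; decay rate is 1 - b when b < 1.
def percentage_rate(b):
    return (b - 1) * 100 if b > 1 else (1 - b) * 100

print(round(percentage_rate(1.41)))  # 41 -> 41% growth (Starbucks model)
print(round(percentage_rate(0.97)))  # 3  -> 3% decay (deaths model)
```

Either way the answer comes out positive; whether it's growth or decay is decided by which side of 1 the base sits on.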
If I subtract that number from 1, I get my percentage; that would be a decay rate. So now I want to estimate the number of stores in 2010. Well, we started in 1991, so t is going to be 2010 minus 1991, which is 19. Now I just have to plug and chug — or let my table do the plug and chug for me. I've already put my equation in there, so I tell the table to start at 19, and I get about 97,296 stores.
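That last table lookup is, again, just evaluating the model — a sketch with the rounded coefficients from the video:

```python
# The fitted Starbucks model: N(t) = 142.21(1.41)^t, t years since 1991.
# For 2010, t = 2010 - 1991 = 19.
def stores(t):
    return 142.21 * 1.41 ** t

print(round(stores(19)))  # 97296 -> estimated stores in 2010
```

A 41% yearly growth rate compounds fast: nineteen years takes the model from about 142 stores to nearly a hundred thousand.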