Hi everyone, it's MJ and we are returning to compound distributions. This time we're going to use the moment generating function to help us calculate the mean and the variance of these kinds of distributions. Let me maybe use a darker blue to get into it. So, a very quick recap of moment generating functions. What we know is that if we take derivatives of the moment generating function, M_X(t) = E[e^{tX}], and evaluate them at t = 0, we get the moments of X: M_X'(0) = E[X], M_X''(0) = E[X²], and so on. What we also found was that, beyond the moment generating function itself, if we take its log we get something called the cumulant generating function, C_X(t) = log M_X(t), and when we evaluate its derivatives at t = 0 we again get moments, but this time moments about the mean. So the moment generating function gives us the moments about zero, whereas the cumulant generating function gives us the moments about the mean. And remember, we came up with these really cool results: straight away you take the second derivative of C_X, evaluate it at zero, and you've got the variance; take the third derivative, evaluate it at zero, and you've got the third central moment, which gives you the skewness. It's an amazing way to almost leapfrog all the mathematics and cut straight to the chase. So that is a very quick recap on moment generating functions. There is an entire course on them, so if you have any difficulty understanding this type of stuff, please ask your questions there. But what we're going to be doing now is using them with our compound distributions to get our mean and variance. So without further ado, let's jump straight into it. The moment generating function of our compound distribution S is going to be M_S(t) = E[e^{tS}].
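To make the recap concrete, here is a minimal sketch in Python. The distribution and parameter are my own illustrative choices, not from the video: we take the MGF of a Poisson(λ) variable, differentiate numerically at t = 0, and recover the mean from M'(0) and the variance from C''(0).

```python
import math

def deriv(f, order=1, h=1e-3):
    """Central finite-difference derivative of f at t = 0 (order 1 or 2)."""
    if order == 1:
        return (f(h) - f(-h)) / (2 * h)
    return (f(h) - 2 * f(0.0) + f(-h)) / h ** 2

# Illustrative choice: X ~ Poisson(lam), whose MGF is exp(lam * (e^t - 1)).
lam = 3.0
M = lambda t: math.exp(lam * math.expm1(t))   # moment generating function
C = lambda t: lam * math.expm1(t)             # cumulant generating function, log M(t)

mean = deriv(M, order=1)   # M'(0)  = E[X]   = lam
var  = deriv(C, order=2)   # C''(0) = Var(X) = lam
print(mean, var)           # both approximately 3.0
```

For the Poisson the mean and variance are both λ, so the two numeric derivatives land on the same value, which is a nice self-check.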
We get this simply from the definition of the moment generating function. Where things start getting interesting is that we can take this expectation of e^{tS} and expand it using conditional expectations: M_S(t) = E[ E[e^{tS} | N] ]. Okay, so where did I get that from? Well, I got that from the previous video, which got it from the video before that. So make sure you're watching those in order to understand how I just did that, because there we proved that the expectation of Y can be expanded in exactly this way, and we've done exactly that for S. Okay. But let's spend a little bit of time on this inner value here, the expectation of exp(tS) given N. I'm going to use the notation exp, but exp is the same as writing e with the exponent at the top; it just makes it a little bit neater. So we open up our bracket, we have our t, and we expand S into the X's: E[exp(t(X₁ + ⋯ + X_n)) | N = n]. Let me take a quick step back here: the exponential of a sum is the product of the exponentials, so exp(t(X₁ + ⋯ + X_n)) = exp(tX₁) × ⋯ × exp(tX_n). I think at this level most of us will have come across that. Then, because the X_i are independent and identically distributed, the expectation of the product is the product of the expectations, and each factor E[exp(tX_i)] is, look at that, exactly our definition of the moment generating function of X. So we have M_X(t) multiplied by itself n times, that is, M_X(t)ⁿ. Which means, if we come back to our M_S(t), we are going to get M_S(t) = E[M_X(t)^N].
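The conditional step above, E[e^{tS} | N = n] = M_X(t)ⁿ, can be sanity-checked by simulation. This is a sketch with my own illustrative choices (X exponential with rate 2, n = 4, t = 0.5; none of these come from the video): we fix n, simulate e^{tS} empirically, and compare against M_X(t)ⁿ.

```python
import math
import random

# Check E[exp(t*S) | N = n] = M_X(t)^n by simulation.
# Illustrative choices: X ~ Exponential(rate=2), so M_X(t) = 2 / (2 - t) for t < 2.
random.seed(42)
rate, n, t = 2.0, 4, 0.5

# Empirical E[exp(t * (X1 + ... + Xn))] with n held fixed
samples = 200_000
emp = sum(
    math.exp(t * sum(random.expovariate(rate) for _ in range(n)))
    for _ in range(samples)
) / samples

theory = (rate / (rate - t)) ** n   # M_X(t)^n = (4/3)^4
print(emp, theory)                  # the two should agree closely
```

The agreement here relies on the X_i being i.i.d., which is exactly the assumption the derivation uses when turning the expectation of the product into a product of expectations.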
And we have to take this expectation because, remember, N is a random variable. We can't just leave it there, because we don't know what it is. Once again, what we can do now is go back to our exponentials and flip it on its head: M_X(t)^N = exp(N log M_X(t)). If you're wondering, wait, hold on, how did the log come in? Remember that log and exp are inverses of each other, just the two the opposite way around. So because we're raising something to the exponential, we counter that with the log. Why we want the log is because when you take the log of something raised to a power, you can pull that power out to the front, which is how the N ends up multiplying log M_X(t) in the exponent. A little bit of mathematical cartwheeling, but it's nothing that high-grade; it's something you should have been comfortable with at school. Where it does get interesting is you can see this is once again in the form of a moment generating function, but this time in N, because the N is the random variable in the exponent. So we're going to have M_N, and instead of t we plug in log M_X(t): M_S(t) = M_N(log M_X(t)). You can see everything in the bracket is what multiplies our random variable, and our random variable N is what the MGF is taken over. And there, what we've done is calculate M_S(t), and we can now take derivatives of it. This is going to be a shortcut to calculating means and variances. Look, I have worked quite quickly through this video, and if you are confused, your confusion probably lives with moment generating functions. Like I said, there is an earlier course that deals with those, and if you have questions around them, please ask them there. Because essentially the only difficult part here is that we're combining the conditional-expectation results we found there with the moment generating function ideas we discussed over here. So make sure you go check out those videos if this is still quite confusing to you. Anyway, I will be seeing you for future videos.
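As a sketch of the shortcut in action, with my own parameter choices rather than the video's: for N ~ Poisson(λ), the result M_S(t) = M_N(log M_X(t)) simplifies nicely, because M_N(u) = exp(λ(e^u − 1)) gives log M_S(t) = C_S(t) = λ(M_X(t) − 1). Differentiating C_S at t = 0 then hands us the mean and variance of S directly.

```python
import math

# Using M_S(t) = M_N(log M_X(t)). For N ~ Poisson(lam), M_N(u) = exp(lam*(e^u - 1)),
# so the cumulant generating function of S is C_S(t) = lam * (M_X(t) - 1).
# Illustrative choices (not from the video): lam = 3, X ~ Exponential(rate=2).
lam, rate = 3.0, 2.0
M_X = lambda t: rate / (rate - t)          # MGF of Exponential(rate), valid for t < rate
C_S = lambda t: lam * (M_X(t) - 1.0)       # cumulant generating function of S

h = 1e-3
mean_S = (C_S(h) - C_S(-h)) / (2 * h)                 # C_S'(0)  = E[S]
var_S  = (C_S(h) - 2 * C_S(0.0) + C_S(-h)) / h ** 2   # C_S''(0) = Var(S)

# Known compound-Poisson checks: E[S] = lam * E[X] = 3 * 0.5 = 1.5,
# and Var(S) = lam * E[X^2] = 3 * 0.5 = 1.5.
print(mean_S, var_S)
```

Note how little work this is compared with summing over all values of N by hand; that is the leapfrog the video is pointing at.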
So stay tuned and I'll see you then. Cheers.