So we have seen a sequence of random variables converging to another random variable in several different notions. Note that if the X_n's have some distribution, the distribution of the limiting random variable need not be the same, or even of the same family. For example, if I have a sequence of random variables which are all exponentially distributed, or all Poisson distributed, it is not necessary that the limit distribution is also exponential or Poisson; it could be something different. But it so happens that when each random variable in the sequence is Gaussian, the limit distribution is also always Gaussian. The parameters may change, but the nature of the limit distribution is still Gaussian. (Student: if n goes to infinity, a binomial becomes Poisson distributed.) Right, that is a good example: each random variable in the sequence could be binomial, but the limit could be something different, not necessarily binomial; the characterization has changed. But if all the random variables happen to be Gaussian, then the limiting distribution is necessarily Gaussian again. As I said, the parameters, the mean and the variance, could be different, but it is still going to be a Gaussian distribution. Okay, next. Whatever we discussed so far is fine, but to define these convergences I always defined them with respect to the limit: I said X_n converges to X in mean square, in probability, or whatever. But is knowing the limit random variable necessary just to define convergence? For example, let us forget random variables and probability for a moment and focus on simple deterministic sequences.
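The binomial-to-Poisson example from the discussion can be checked numerically. The sketch below (the rate λ = 3 and the range of k are arbitrary illustrative choices, not from the lecture) compares the Binomial(n, λ/n) pmf against the Poisson(λ) pmf and shows the gap shrinking as n grows, while the binomial family itself is left behind:

```python
import math

def binomial_pmf(k, n, p):
    # P(X = k) for X ~ Binomial(n, p); math.comb returns 0 when k > n
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 3.0
for n in (10, 100, 10000):
    # largest pointwise pmf gap over k = 0..19
    gap = max(abs(binomial_pmf(k, n, lam / n) - poisson_pmf(k, lam))
              for k in range(20))
    print(f"n = {n:6d}   max pmf gap = {gap:.6f}")
```

The gap decays roughly like λ²/n, so by n = 10000 the two pmfs agree to three or four decimal places.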
Say I have a sequence a_n which converges to some a; that is fine. But just looking at the sequence, without knowing the limit, can I say whether a_n converges? Is it necessary that I always know the limit before I can tell whether it converges or not? How many of you have heard about the Cauchy criterion, or Cauchy sequences? What does the Cauchy criterion say; does it involve the limit? No, right? So to decide whether a sequence converges, I do not always need to know the limiting value. It is enough if the sequence itself satisfies certain properties; then I can say it converges, even though I may not know what the limit is. All I can guarantee is that it converges to something. The same can be done for a sequence of random variables. Most of the time it may be difficult to guess the limiting random variable or the limiting distribution, but the sequence may be amenable to checking some conditions which tell us at least that it converges. What it converges to is a later headache; first let us worry about whether it converges, and once we have convinced ourselves that it does, we can think about what the limit is. For that we have a similar notion of a Cauchy criterion for convergence of a sequence of random variables. First, in the classical deterministic setting, if I have a sequence a_n, what is the Cauchy criterion for convergence? For all ε > 0 there exists N_ε such that |a_n − a_m| < ε for all m, n ≥ N_ε. If this condition holds, I know that the sequence converges; it is just that I do not know what the limit is. Now we are going to restate our notions of convergence in terms of this Cauchy convergence criterion. I am just going to state this as a result. So X_n converges almost surely if and only if the corresponding Cauchy-type condition holds.
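For the deterministic case, here is a small numerical sketch (the series Σ 1/k² is an arbitrary example) certifying convergence via the Cauchy criterion, without ever referring to the limit π²/6:

```python
# Partial sums s_n of sum_{k=1}^{n} 1/k^2. The sequence is Cauchy, so it
# converges, and we can certify that numerically without using the limit.
def partial_sum(n):
    return sum(1.0 / (k * k) for k in range(1, n + 1))

def cauchy_gap(N, horizon=5000):
    # max |s_n - s_m| over N <= m, n <= horizon; since s_n is increasing,
    # this maximum is simply s_horizon - s_N
    return partial_sum(horizon) - partial_sum(N)

for N in (10, 100, 1000):
    print(f"N = {N:4d}   sup gap = {cauchy_gap(N):.6f}")
```

The printed gaps shrink toward 0 as N grows, which is exactly the Cauchy condition with the threshold N_ε playing the role of N here.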
This is the equivalent of the Cauchy criterion for random variables. For almost sure convergence, the analog version of the Cauchy criterion is as follows: earlier I needed to know the limiting random variable X, but here I do not worry about that. I take a pair of random variables X_n and X_m, let m and n go to infinity, and ask whether |X_n(ω) − X_m(ω)| goes to 0; if the probability of the set of all ω satisfying this condition is 1, I say the sequence converges almost surely, without specifying which random variable it converges to; it converges to something. Similarly, convergence in probability is defined like this: earlier we had X_n and the limit X, and now that X is replaced by X_m, another point in the sequence. We look at the probability that the difference |X_n − X_m| exceeds ε, and if that goes to 0 as m, n go to infinity, we say the sequence converges in probability to some random variable. The same goes for convergence in the mean squared sense. Notice that when I talk about convergence in the mean squared sense, it is automatically assumed that the X_n's are such that their second moments are finite, E[X_n²] < ∞ for all n; that is already implied in condition (c). We will skip the proof; it is analogous to how you prove that the Cauchy criterion implies convergence for deterministic sequences, but you have to worry about constructing the sets, the epsilon-delta business, and also interchanging the limits appropriately. You can look into the book for details. (Student: when you go from almost sure convergence to convergence in probability, can we interchange the probability and the limit here?)
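In symbols, the three Cauchy criteria just described might be written as follows (a sketch consistent with the definitions in the lecture):

```latex
% (a) Cauchy criterion for almost sure convergence:
P\Big(\Big\{\omega : \lim_{m,n\to\infty}\big|X_n(\omega)-X_m(\omega)\big| = 0\Big\}\Big) = 1
% (b) Cauchy criterion for convergence in probability: for every \varepsilon > 0,
\lim_{m,n\to\infty} P\big(|X_n - X_m| > \varepsilon\big) = 0
% (c) Cauchy criterion for mean-square convergence (with E[X_n^2] < \infty for all n):
\lim_{m,n\to\infty} E\big[(X_n - X_m)^2\big] = 0
```

In each case the limit random variable X of the original definitions has been replaced by another member X_m of the sequence.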
No, we cannot in general. To do that we stated a specific condition: if you are looking at lim as n tends to infinity of P(B_n), and the sequence of events B_n is monotonically increasing, then I can interchange the limit and the probability; that is what we called continuity of probability. If I do not have such a structure on the events B_n, in general I cannot do this; you have to be careful there. So it is not true in general that we can interchange limit and probability. You may also face cases where you have to interchange limit and expectation; you cannot do that freely either. When we can interchange limit and probability we have already stated; when we can interchange limit and expectation we will state in the next class. This can be done only under certain conditions. Fine, so for convergence of random variables we now have this Cauchy criterion. Now, is a similar criterion possible if we know something about the correlations of the random variables? I have a sequence of random variables, and I can look at their correlations: take a pair X_n and X_m, take the product, and look at its expectation E[X_n X_m]; that is the correlation for us, right? The next result says that yes, there is a connection like that. This is called the correlation version of the Cauchy criterion for mean squared convergence. The statement is: take a sequence of random variables with finite second moments; then there exists an X which is the limit of the sequence in the mean squared sense if and only if lim as m, n tend to infinity of E[X_n X_m] exists and is finite. Now let us see why this is true; we will quickly argue this. First, the "if" part.
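The correlation criterion can be stated compactly (this is a sketch of the statement as described in the lecture):

```latex
% Correlation version of the Cauchy criterion for mean-square convergence.
% Suppose E[X_n^2] < \infty for all n. Then there exists a random variable X with
E\big[(X_n - X)^2\big] \;\longrightarrow\; 0 \qquad (n \to \infty)
% if and only if
\lim_{m,n\to\infty} E[X_n X_m] \;\text{ exists and is finite.}
```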
Do you see the difference between part (c) earlier and this result? Part (c) said that X_n converges in mean square if the Cauchy criterion holds; this result replaces that Cauchy criterion by a correlation criterion, where you now test something on the correlations of the sequence. So suppose lim as m, n tend to infinity of E[X_n X_m] is some value C, and that it is finite; that is the hypothesis. What we will do is apply the Cauchy condition: if you expand E[(X_n − X_m)²] for any n, m, you get E[X_n²] − 2 E[X_n X_m] + E[X_m²]. Now, to argue that the sequence converges in mean squared sense, I need to show that this quantity goes to 0 as m, n tend to infinity; that is the Cauchy form of mean squared convergence. I know that the correlation term E[X_n X_m] goes to C as m, n tend to infinity; that is the hypothesis. What can I say about E[X_n²] and E[X_m²]? When I let m and n go to infinity, they can go arbitrarily, so I could as well set n = m and let both go to infinity; in that case E[X_n X_n] = E[X_n²] has the same limit C, and likewise for E[X_m²]. So you can check that each of the three terms converges, and the whole expression converges to C − 2C + C = 0. Notice there are two things here: you are not setting n = m in the Cauchy criterion itself; m and n go to infinity arbitrarily and could be different. But the expression contains E[X_n²] and E[X_m²], and I am inferring how those terms behave from the hypothesis, by considering the diagonal n = m.
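Written out, the key step in this direction is:

```latex
E\big[(X_n - X_m)^2\big]
  = E[X_n^2] - 2\,E[X_n X_m] + E[X_m^2]
  \;\longrightarrow\; C - 2C + C = 0
  \qquad (m, n \to \infty),
```

where the diagonal choice n = m in the hypothesis gives E[X_n²] → C and E[X_m²] → C, and the middle term tends to C directly by hypothesis.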
So if the correlation limit is C, then even along n = m, with both going to infinity, the limit is also C; that is what I am using for those two squared terms. What we have just said is: if the correlation criterion holds, this is the same as verifying the Cauchy criterion, and from the Cauchy criterion we already know that the sequence converges in the mean squared sense. If it converges in the mean squared sense, it has to converge to some random variable; that is why we say there exists some X such that X_n converges to it. We do not know what this X is, but there exists some X. Now, to prove the other direction, we assume that X_n converges to some random variable X in the mean squared sense, and we try to show that in this case the correlation limit exists and is finite. When we talk about mean squared convergence, our standing assumption is that E[X_n²] is finite for all n. You can verify that this implies that the limiting random variable X also satisfies E[X²] < ∞. How are you going to do this? By applying the triangle inequality for random variables. Just apply it and you can infer that E[X²] is also finite. (Student: which inequality?) The triangle inequality for random variables; we talked about this when we discussed the Cauchy-Schwarz inequality. It is in the same section of the book where the Cauchy-Schwarz inequality is defined. By applying that, you should be able to infer this. Now that we know this, we want to show that the correlation limit exists.
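The finiteness of E[X²] follows from the triangle (Minkowski) inequality in the mean-square norm, which might be applied as follows:

```latex
\sqrt{E[X^2]} \;\le\; \sqrt{E\big[(X - X_n)^2\big]} \;+\; \sqrt{E[X_n^2]},
```

and taking n large enough that E[(X − X_n)²] ≤ 1 (possible since this quantity tends to 0), the right-hand side is finite because E[X_n²] < ∞, hence E[X²] < ∞.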
Now take the quantity E[X_n X_m]. Some algebraic manipulation is needed, and then we will again invoke the Cauchy-Schwarz inequality to conclude. Write X_n = X + (X_n − X) and X_m = X + (X_m − X); I will cut this short since it is all algebra, but by expanding, E[X_n X_m] splits into E[X²] plus cross terms involving (X_n − X) and (X_m − X). Now we apply the Cauchy-Schwarz inequality to each of these cross terms to derive a bound. What does the Cauchy-Schwarz inequality say? If I have a product of two random variables, its expectation is upper bounded in terms of the second moments: the bound is the square root of the expectation of the square of the first factor times the expectation of the square of the second factor. So apply Cauchy-Schwarz to each of these cross terms. Consider the term whose bound involves E[(X_m − X)²]: by assumption X_n converges to X in the mean squared sense, so as m goes to infinity this goes to 0, and the term vanishes. Similarly the term with E[(X_n − X)²] vanishes, and likewise the term with both factors. What remains? E[X²]. So what we have shown is that the limit of E[X_n X_m] as m, n tend to infinity equals E[X²]. We have now shown existence: the limit indeed exists, and it equals E[X²], where X is the random variable we assumed the sequence converges to. The second part is to show it is finite. Why is it finite? Because, as we said, mean squared convergence also implies that E[X²] is finite.
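This "only if" direction can be sketched in symbols as:

```latex
E[X_n X_m]
  = E[X^2] + E\big[X(X_n - X)\big] + E\big[X(X_m - X)\big]
    + E\big[(X_n - X)(X_m - X)\big]
% each remainder term vanishes by Cauchy--Schwarz, e.g.
\big|E[X(X_m - X)]\big| \;\le\; \sqrt{E[X^2]\, E\big[(X_m - X)^2\big]} \;\longrightarrow\; 0,
% hence
\lim_{m,n\to\infty} E[X_n X_m] = E[X^2] < \infty.
```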
In the last one or two minutes: as a consequence of this theorem we can derive many corollaries, which I am just going to state; they are very useful and you should know how to use them. Corollary: if X_n converges to X in the mean squared sense and another sequence Y_n converges to Y in the mean squared sense, then you can check that E[X_n Y_n] converges to E[XY]. This is the analog of the deterministic case: if a_n converges to a and b_n converges to b, the product sequence a_n b_n converges to ab; exactly the same thing holds here. Also, if X_n converges to X in the mean squared sense, then E[X_n] converges to E[X]; so mean squared convergence also implies convergence of the expectations. Why is this true? Take the Y_n's to be simply 1 for all n; then Y_n converges to Y = 1, and the first corollary already gives E[X_n · 1] converging to E[X · 1], that is, E[X_n] converging to E[X]. These are all pretty useful results, so you should know how to use them. For the proofs I am skipping; you can look into the book. Let us stop here.
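The product corollary can be illustrated with a small Monte Carlo sketch (the choice X ~ Uniform(0,1), Y = X², and the shared-noise construction X_n = X + Z/n, Y_n = Y + Z/n are illustrative assumptions, not from the lecture):

```python
import random

N_SAMPLES = 200_000

def estimate_EXnYn(n):
    # Monte Carlo estimate of E[X_n Y_n] with X ~ Uniform(0,1), Y = X^2,
    # X_n = X + Z/n, Y_n = Y + Z/n for a shared Z ~ N(0,1) independent of X.
    # Both sequences converge in mean square (to X and Y), and exactly
    # E[X_n Y_n] = E[X Y] + E[Z^2]/n^2 = 1/4 + 1/n^2.
    random.seed(42)  # same underlying samples for every n
    acc = 0.0
    for _ in range(N_SAMPLES):
        x = random.uniform(0.0, 1.0)
        z = random.gauss(0.0, 1.0)
        acc += (x + z / n) * (x * x + z / n)
    return acc / N_SAMPLES

# E[X Y] = E[X^3] = 1/4 for X ~ Uniform(0, 1)
for n in (1, 10, 100):
    print(f"n = {n:3d}   E[X_n Y_n] ~ {estimate_EXnYn(n):.4f}")
```

As n grows, the estimates approach E[XY] = 1/4, matching the corollary that mean-square convergence of both factors forces convergence of the product's expectation.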