So, now let us look at some of the important results in probability concerning weak and strong convergence. First, recall what we mean by an IID sequence — we touched on this when we defined joint distributions of a sequence of random variables. A random process (X_n, n ≥ 1) is said to be IID (independent and identically distributed) if the random variables are mutually independent and all have the same distribution. It is as simple as that. Now we have the following result. Let (X_n, n ≥ 1) be such that E[X_n] = m for all n, where the common mean m is finite, and define S_n = X_1 + X_2 + ... + X_n, the sum of the first n random variables, so that S_n/n is the average of the first n terms. Then: (a) if, in addition, the variance of each X_n is bounded and the covariance of any pair of distinct random variables is 0, then S_n/n converges to m in the mean-square sense.
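As a quick numerical sanity check of statement (a), the following sketch estimates the mean-square error E[(S_n/n − m)²] by Monte Carlo and watches it shrink as n grows. The choice of distribution (exponential), the mean m = 2, and the number of trials are illustrative assumptions, not taken from the lecture.

```python
import random

# Monte Carlo sketch: the mean-square error E[(S_n/n - m)^2] decays like Var(X)/n.
# Exponential variables with mean m = 2 (so variance m^2 = 4) are an illustrative choice.
random.seed(0)
m = 2.0        # common mean of the X_i
trials = 2000  # sample paths used to estimate the expectation

mses = {}
for n in (10, 100, 1000):
    total = 0.0
    for _ in range(trials):
        s = sum(random.expovariate(1.0 / m) for _ in range(n))  # S_n on one path
        total += (s / n - m) ** 2
    mses[n] = total / trials
    print(f"n={n:5d}  estimated E[(S_n/n - m)^2] = {mses[n]:.5f}")  # roughly 4/n
```

The printed errors drop by about a factor of 10 each time n grows tenfold, consistent with the C/n bound derived below in the lecture.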
To be precise, the variances are bounded by the same constant — they are uniformly bounded, but need not be equal — and the pairwise covariances are 0, that is, the X_i are uncorrelated. Note that uncorrelatedness is a weaker condition than independence: in bullet (a) we are not saying the X_i are independent; we are only saying they have the same mean, bounded variances, and zero pairwise covariances. (b) The second point says that if the sequence is IID, then — since the variables all have the same distribution, they necessarily have the same mean and the same variance — S_n/n converges to the mean m in probability. (c) One can show a stronger result under the same IID assumption: the sequence converges in the almost sure sense as well. You know that almost sure convergence is a stronger notion than convergence in probability, so proving (c) already implies (b). Part (b) is often called the weak law of large numbers, and the last part is called the strong law of large numbers. Let us try to understand why the first part is true. If I want to show that S_n/n goes to m in the mean-square sense, what do I need to show? I need to show that E[(S_n/n − m)²] goes to 0; if that goes to 0, I can argue that S_n/n converges to m in the mean-square sense. So what is the mean of S_n/n? Before that, what is the mean of S_n? It is n·m, because it is nothing but a sum of n expectations, each of value m. So the expected value of S_n/n is already m, and therefore E[(S_n/n − m)²] is nothing but the variance of S_n/n.
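Writing S_n = X_1 + ... + X_n and E[X_n] = m, the three statements can be recorded compactly as follows.

```latex
% (a) Mean-square law: E[X_n] = m, Var(X_n) \le C, Cov(X_i, X_j) = 0 for i \ne j imply
\mathbb{E}\!\left[\left(\frac{S_n}{n} - m\right)^{2}\right] \;\xrightarrow[n \to \infty]{}\; 0.
% (b) Weak law: for IID X_n with finite variance, for every \varepsilon > 0,
\mathbb{P}\!\left(\left|\frac{S_n}{n} - m\right| > \varepsilon\right) \;\xrightarrow[n \to \infty]{}\; 0.
% (c) Strong law: for IID X_n,
\mathbb{P}\!\left(\lim_{n \to \infty} \frac{S_n}{n} = m\right) \;=\; 1.
```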
So, Var(S_n/n) is nothing but (1/n²) Var(S_n), and we can write this as (1/n²) Cov(S_n, S_n) — recall that the variance of a random variable is its covariance with itself. Now, S_n is a sum of n terms in each slot, and we know how to find the covariance of such sums; we did this exercise in class some time back. Expanding, the value turns out to be the double sum of Cov(X_i, X_j) over i and j from 1 to n. When i and j are different we use the uncorrelatedness property, so those terms vanish, and what remains is the sum over i of Cov(X_i, X_i), which is just Var(X_i). By the assumption we are making, each Var(X_i) is bounded above by C. So what does this give? Var(S_n/n) ≤ nC/n² = C/n, and C is assumed finite. So if I let n go to infinity, this goes to 0. Is it then true that S_n/n converges in the mean-square sense? That is exactly what we have proved: as n goes to infinity, E[(S_n/n − m)²] goes to 0. So as long as the variances are bounded, the means are the same, and the random variables are pairwise uncorrelated, we have convergence in the mean-square sense. What about part (b)? Can we say something about (b) using part (a)? Convergence in the mean-square sense always implies convergence in probability. Now, I am stating (b) for the case where the sequence is IID, and if the sequence is IID, the equal-means condition already holds, because all of them have the same distribution.
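The key cancellation step — only the diagonal of the covariance double sum survives — can be checked exactly with a tiny computation. The particular variances below are illustrative values, not from the lecture; the point is that Var(S_n) collapses to the sum of the individual variances, giving Var(S_n/n) ≤ C/n.

```python
import random

# Exact check of Var(S_n) = sum over i, j of Cov(X_i, X_j) = sum_i Var(X_i)
# when all pairwise covariances vanish.  Variances below are arbitrary values <= C.
random.seed(1)
n, C = 8, 5.0
variances = [random.uniform(1.0, C) for _ in range(n)]  # Var(X_i) <= C, not all equal

cov = lambda i, j: variances[i] if i == j else 0.0      # uncorrelated: off-diagonal is 0
var_Sn = sum(cov(i, j) for i in range(n) for j in range(n))

assert abs(var_Sn - sum(variances)) < 1e-12  # only the diagonal contributes
assert var_Sn / n**2 <= C / n                # hence Var(S_n/n) <= C/n
print(var_Sn / n**2, "<=", C / n)
```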
So their means are necessarily the same. What is not guaranteed is that the variances are finite: they are all equal, but they could be infinite. So for case (b), if I make the extra assumption that the variables are IID and, further, that their variances are finite, then we are done — part (a) automatically applies. That is part (b): if Var(X_i) is finite for all i, then S_n/n converges to m in probability. We just argued this: if my sequence is IID and in addition the variances are finite, then all the assumptions of part (a) hold; under those assumptions we have shown convergence in the mean-square sense, and we already know that mean-square convergence automatically implies convergence in probability. What if the variances are not necessarily bounded — if they could be infinite? That needs a bit more analysis, so we will skip it; it can be done, and we will do a proof later with a similar flavour that you can use to prove this part as well. Then comes the last part: we now want to prove the stronger statement that S_n/n converges in the almost sure sense. Again, proving this in full generality is difficult, so we are going to argue it, as we did for part (b), under a restricted case. To prove this we now assume that the fourth moments are finite. I am only writing that the fourth moment of X_1 is finite, but that means the fourth moment of every random variable is finite, because we are assuming an IID sequence. We also discussed at some point that if the fourth moment is finite, then all the moments of lower order are finite too.
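The step from mean-square convergence to convergence in probability is Chebyshev's inequality applied to the C/n variance bound from part (a):

```latex
\mathbb{P}\!\left(\left|\frac{S_n}{n} - m\right| \ge \varepsilon\right)
  \;\le\; \frac{\mathbb{E}\!\left[\left(\frac{S_n}{n} - m\right)^{2}\right]}{\varepsilon^{2}}
  \;\le\; \frac{C}{n\,\varepsilon^{2}}
  \;\xrightarrow[n \to \infty]{}\; 0
  \qquad \text{for every } \varepsilon > 0.
```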
If we can show convergence under that assumption, it already covers part (b), because (b) only required the variance — a second-order statistic — to be finite, whereas here I am assuming a bound on the fourth-order moment; if the fourth moment is finite, the second moment is automatically finite, which is what we used to show convergence in probability. Now let us see why this assumption gives convergence in the almost sure sense. This is where the monotone convergence theorem comes into the picture. How does it come in? The first thing to note is that if I take the fourth moment of S_n, the sum of the first n random variables, and apply the IID assumption, the expression simplifies. To simplify the argument we make two assumptions: first, the fourth moment E[X_1⁴] is finite; second, the mean is 0, that is, E[X_1] = 0 — and since this is an IID sequence, the mean of every random variable is then 0. We are proving the result under this centering because when E[X_1] ≠ 0 the expression becomes too long: besides the X⁴ and X² terms we get all the cross terms like X_1 X_2 over all pairs, so we would have to write too many terms. With E[X_1] = 0 those drop out and we get E[S_n⁴] = n E[X_1⁴] + 3n(n−1)(E[X_1²])². Now I am going to define Y to be the sum over n of (S_n/n)⁴. How should we interpret this? As the limit, as m goes to infinity, of the partial sums from n = 1 to m — you understand, it is not a sequence here, it is a series. Why is Y well defined — why can I define it like this? It is a limit, and I can write a limit when it exists.
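The simplified fourth-moment formula can be verified exactly, not just numerically, by choosing mean-zero variables with finitely many values. Here we use Rademacher variables (X = ±1 with probability 1/2 each — an illustrative choice, not from the lecture), so that E[X²] = E[X⁴] = 1 and the expectation of S_n⁴ is an exact average over all 2ⁿ sign patterns.

```python
from itertools import product

# Exact check of E[S_n^4] = n*E[X_1^4] + 3n(n-1)*(E[X_1^2])^2 for mean-zero IID X_i.
# For Rademacher variables E[X^2] = E[X^4] = 1, so the formula predicts 3n^2 - 2n.
for n in range(1, 9):
    # Average S_n^4 over every equally likely sign pattern (exact, no sampling).
    fourth = sum(sum(signs) ** 4 for signs in product((-1, 1), repeat=n)) / 2 ** n
    predicted = n * 1 + 3 * n * (n - 1) * 1 * 1
    assert fourth == predicted, (n, fourth, predicted)
print("E[S_n^4] = n E[X^4] + 3n(n-1) E[X^2]^2 checked exactly for n = 1..8")
```

Only index patterns that pair up survive the expectation when the mean is zero: n "all four equal" terms and 3n(n−1) "two matched pairs" terms, which is exactly what the formula counts.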
Now, notice that S_n/n is raised to an even power, so all the terms are non-negative. So let us write Y_m for the partial sum up to m, giving a sequence (Y_m). Is this sequence monotonically increasing? Yes, because it keeps adding non-negative terms — and this is true for any ω you take: for each ω, (Y_m(ω)) is a monotonically increasing sequence, so it converges, possibly to infinity, and that limiting value is what I take as Y(ω). So Y is defined as the limit of the Y_m's, and by construction Y_m(ω) ≤ Y_{m+1}(ω) — a monotonically increasing sequence. If that is the case, can I invoke the monotone convergence theorem, apply expectation on both sides, and interchange the limits? Yes: E[Y] is the expectation of a limit, and because I have argued the sequence is monotonically increasing, I have the liberty to interchange the limit and the expectation, so E[Y] = lim_m E[Y_m]. I am doing nothing here but playing with the definitions we have introduced — if you have lost track, just watch what is being manipulated. Now I use the relation for E[S_n⁴] from above. Substituting it in, I invoke the condition that the fourth moment is finite — and recall that I invoked E[X_1] = 0 when I wrote that expression; had E[X_1] been nonzero, there would have been many more terms.
Thanks to this assumption we could write the expression this simply. Now I use the finiteness of the fourth moment, which also means the second moment is finite. Look at the counting: the E[X_1⁴] piece appears n times, the (E[X_1²])² piece of order n² times, while the denominator is n⁴. So the n-th term of the series is E[X_1⁴]/n³ plus 3(1/n² − 1/n³)(E[X_1²])². This is not a sequence but a series, and its sum converges to some finite value. You can check that: we know that Σ 1/n² converges, so Σ 1/n³ should definitely converge too; similarly the 1/n² part converges and the 1/n³ part converges, so their difference is also finite. Hence this whole quantity — the series — is finite. I do not know what that value is, but what we are trying to argue is only that E[Y] is finite. Now, if E[Y] is finite, it must be the case that Y takes a finite value with probability 1: P(Y < ∞) = 1. This implication is always true — finite expectation implies finiteness with probability 1 — but the other way around is not always true. So let us quickly complete the argument. The last point we need is how we have defined Y: as a series.
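The convergence of the series E[Y] = Σ_n E[(S_n/n)⁴] can be eyeballed by computing partial sums. The moment values E[X_1²] = E[X_1⁴] = 1 below are illustrative stand-ins (e.g. Rademacher variables), not taken from the lecture.

```python
import math

# Partial sums of sum_n [ E[X_1^4]/n^3 + 3*(1/n^2 - 1/n^3)*(E[X_1^2])^2 ]
# with both moments set to 1: the terms are dominated by multiples of 1/n^2,
# so the partial sums increase but stay bounded.
def term(n, m4=1.0, m2sq=1.0):
    return m4 / n**3 + 3.0 * (1.0 / n**2 - 1.0 / n**3) * m2sq

partials, running = [], 0.0
for n in range(1, 10_001):
    running += term(n)
    if n in (10, 100, 1000, 10_000):
        partials.append(running)

print(partials)                    # increasing, settling near 2.53
assert all(a < b for a, b in zip(partials, partials[1:]))  # monotone increasing
assert partials[-1] < 3 * math.pi**2 / 6                   # bounded by 3*zeta(2)
```

Bounded monotone partial sums converge, which is exactly the finiteness of E[Y] the lecture needs.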
If Σ a_n converges — say to some value a — what must the terms a_n look like? Where will a_n converge to? To 0. So on the event where Y is finite, (S_n/n)⁴ converges to 0, and taking the fourth root we can also argue that S_n/n converges to 0 for those ω. Now we have to argue that this gives almost sure convergence. Why? Because Y being finite happens with probability 1 — that is what we established, and that is what we are using: the first event implies the second. If {Y < ∞} already has probability 1, what can the probability of {S_n/n → 0} be? It must also be 1. So in what sense does the sequence converge? In the almost sure sense — to 0, which is the mean m under our centering assumption. Is that clear? If Y < ∞, the terms of the defining series must go to 0, and that event happens with probability 1. That the terms of a convergent series go to 0 is a standard result in analysis. Why? Take the contrapositive: suppose a_n does not converge to 0 — then the partial sums cannot settle down to a finite limit, so the series cannot converge to a finite value. (It can still diverge to infinity, but we have specifically shown that our sum is finite.) Note that the converse is not true: the terms going to 0 does not force the series to converge. For example, take a_n = 1/n: Σ 1/n goes to infinity, whereas 1/n goes to 0. So, let us stop here.
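The closing counterexample is easy to see numerically: the harmonic terms 1/n go to 0, yet the partial sums pass any threshold we pick. The threshold 10 below is an arbitrary illustrative choice.

```python
# Terms of a convergent series must go to 0, but not conversely:
# a_n = 1/n goes to 0 while the harmonic partial sums grow without bound.
partial, n = 0.0, 0
while partial <= 10.0:   # the partial sums eventually exceed any fixed threshold
    n += 1
    partial += 1.0 / n
print(n, 1.0 / n)        # by the time the sum passes 10, the terms are tiny
assert partial > 10.0
assert 1.0 / n < 1e-3    # a_n -> 0 even though the series diverges
```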