Welcome back. Last time we discussed the monotone convergence theorem, which says that if you have a sequence of functions converging almost everywhere monotonically, then you can interchange limit and integration: the integral of the limit is the same as the limit of the integrals. Today we will discuss the dominated convergence theorem, which gives another sufficient condition for interchanging limit and integration. Essentially, the dominated convergence theorem says that if you have a sequence of functions converging to another function, not necessarily monotonically, but the sequence is dominated in absolute value by a function with a finite integral, then you can interchange limit and integration. It is a very useful theorem; in particular, for a sequence of uniformly bounded functions you can always interchange limit and integration. The dominated convergence theorem can be derived as a consequence of the monotone convergence theorem; it is like a corollary of the MCT. So in some sense the MCT is all there is, and Fatou's lemma is also a corollary of the MCT. In fact, the way it goes is: from the MCT you prove Fatou's lemma, and from Fatou's lemma you prove the dominated convergence theorem. The dominated convergence theorem is practically very useful; Fatou's lemma is a little more technical, so I will discuss it, but I will not hold you responsible for it — I will not consider it part of your syllabus. The dominated convergence theorem, though, is very useful and important. So let us discuss Fatou's lemma first. Let me motivate it like this: note that if X and Y are random variables, we always have E[min(X, Y)] <= min(E[X], E[Y]).
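This inequality is easy to check numerically. A minimal sketch in Python, with a made-up joint distribution on four equally likely outcomes (the particular values of X and Y here are just for illustration):

```python
# Hypothetical finite sample space: four outcomes, each with probability 1/4.
p = 0.25
X = [1.0, 4.0, 2.0, 5.0]  # X(omega) for omega = 0, 1, 2, 3
Y = [3.0, 2.0, 2.0, 1.0]  # Y(omega)

E_X = sum(p * x for x in X)                       # E[X] = 3.0
E_Y = sum(p * y for y in Y)                       # E[Y] = 2.0
E_min = sum(p * min(x, y) for x, y in zip(X, Y))  # E[min(X, Y)] = 1.5

# The expectation of the minimum sits below the minimum of the expectations.
assert E_min <= min(E_X, E_Y)
```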
So the expectation of the minimum is less than or equal to the minimum of the two expectations. The reason is easy to see: min(X, Y) <= X and min(X, Y) <= Y. Taking expectations in both, E[min(X, Y)] <= E[X] and E[min(X, Y)] <= E[Y], so E[min(X, Y)] must be less than or equal to the smaller of the two. The same argument works for n random variables: if X_1 through X_n are random variables, the expectation of the minimum is less than or equal to the minimum of the expectations; this is always true for a finite number of random variables. Fatou's lemma is in the same spirit, except that it considers a sequence of random variables, not a finite collection X_1 through X_n. Now, if you have a sequence X_1, X_2, ..., there is no notion of a minimum: for an infinite collection of numbers the minimum may not be attained, so you have to talk about the infimum. That is what Fatou's lemma does: it is a very analogous result, trivial for a finite number of random variables, but Fatou's lemma generalizes it to a sequence. So, Fatou's lemma: let X_n be a sequence of random variables such that X_n >= Y for all n, and E[|Y|] < infinity.
I am stating it for random variables; you can state it for functions too, there is no big difference. I will state both Fatou's lemma and the dominated convergence theorem just for random variables. Then E[liminf_n X_n] <= liminf_n E[X_n]. And if you replace X_n by -X_n, you get the second part: if X_n is a sequence of random variables such that X_n <= Y for all n, and E[|Y|] < infinity, then E[limsup_n X_n] >= limsup_n E[X_n]. This second part is not a separate statement: statement 2 is the same as statement 1 with X_n replaced by -X_n and Y replaced by -Y. So if you prove the first, you have proved the second, since liminf of -X_n is the same as minus limsup of X_n. Before we go on and prove this, recall what liminf is. For a sequence a_n, liminf_{n -> infinity} a_n is defined as lim_{n -> infinity} inf_{m >= n} a_m. That is the definition of liminf; I believe we said this in the very first lecture, when I gave a little overview of limits. So you look at the tail sequence a_n, a_{n+1}, ... for m >= n, take its infimum, and then send n to infinity. Since you are taking the infimum over m >= n, these tail infima form a non-decreasing sequence in n, so this limit always exists. Thus the liminf is always well defined: the limit of the sequence may not exist, but the liminf always exists. The limsup is defined the same way with the supremum.
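A small numerical sketch of this definition (the sequence a_n = (-1)^n (1 + 1/n) is just an example I am assuming here; its limit does not exist, but its liminf is -1):

```python
# liminf a_n = lim_n inf_{m >= n} a_m, approximated by truncating the tails.
N = 5000
a = [(-1) ** n * (1 + 1 / n) for n in range(1, N + 1)]  # a[i] is a_{i+1}

# tail_inf[n] = infimum over the tail a_{n+1}, a_{n+2}, ...: non-decreasing in n.
tail_inf = [min(a[n:]) for n in range(200)]
assert all(tail_inf[i] <= tail_inf[i + 1] for i in range(len(tail_inf) - 1))

# The tail infima increase towards the liminf, which is -1 here, even though
# lim a_n itself does not exist (the sequence oscillates around +1 and -1).
assert abs(tail_inf[-1] - (-1.0)) < 0.01
```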
Now, what is liminf_{n -> infinity} X_n? For each omega you consider the sequence of real numbers X_n(omega), and liminf X_n is defined pointwise: for every omega, (liminf X_n)(omega) = liminf_n X_n(omega), exactly as above. You can show this is in fact a random variable: the liminf, limsup, and limit of a sequence of random variables are all random variables. So the lemma says: if you take the expectation of the liminf random variable, you always get something smaller than or equal to what you get by first taking the expectations and then taking the liminf of that numerical sequence. Exactly like the finite case, except that with an infinite sequence you cannot talk about the minimum, you talk about the infimum; and the liminf is the smallest limit point of the sequence. Is the statement clear? The proof is as follows. Since proving 1 proves 2, let me just prove 1. Fix n; then inf_{k >= n} (X_k - Y) <= X_m - Y for all m >= n. This Y is a random variable with finite mean which acts as a lower bound on all the X_n's. Often Fatou's lemma is applied to non-negative random variables, in which case you simply take Y = 0: if you have a sequence with X_n >= 0, that is a particular case of Fatou's lemma, and that is how it is most often applied — with Y = 0, or any lower bound. So this Y is acting like a lower bound, and Y has finite mean.
You are looking at the tail sequence for n, n+1, and so on, and taking the infimum; that must be smaller than or equal to X_m - Y for any m >= n. This is very clear from the definition of the infimum. So take expectations: E[inf_{k >= n} (X_k - Y)] <= E[X_m - Y] for all m >= n. I am doing the same kind of step as in the finite case — taking expectations on both sides — and the inequality holds for all m >= n. Now I make the equivalent of the next step: take the infimum over all m >= n. The left-hand side has no m in it, so taking an infimum over m there changes nothing, and I get E[inf_{k >= n} (X_k - Y)] <= inf_{m >= n} E[X_m - Y]. Now I send n to infinity. Look at the right-hand side: if I put lim_{n -> infinity} in front of inf_{m >= n}, then, viewing E[X_m - Y] as the sequence a_m, this becomes by definition liminf_{n -> infinity} E[X_n - Y]. So taking the limit as n goes to infinity, I have lim_{n -> infinity} E[inf_{k >= n} (X_k - Y)] <= liminf_{n -> infinity} E[X_n - Y]. If you are getting bothered by this Y, think of Y as 0; mostly Fatou's lemma is applied with Y a constant random variable, in particular equal to 0.
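Written out, the steps so far chain together like this (a sketch in LaTeX notation, using the standing assumption X_k >= Y for all k):

```latex
% Expectation of the tail infimum, then infimum over m, then n -> infinity:
\mathbb{E}\Big[\inf_{k \ge n}(X_k - Y)\Big]
  \;\le\; \inf_{m \ge n} \mathbb{E}[X_m - Y]
  \qquad \text{(take expectations, then infimum over } m \ge n\text{)}
\\[4pt]
\lim_{n \to \infty} \mathbb{E}\Big[\inf_{k \ge n}(X_k - Y)\Big]
  \;\le\; \liminf_{n \to \infty} \mathbb{E}[X_n - Y]
  \qquad \text{(let } n \to \infty\text{)}
```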
It does not have to be, but that is how it is usually applied. If you are bothered by the Y, just pretend it does not exist — I am going to keep writing it, but it is only a little bit of hard work keeping track of it through all these infima, so at first sight you may want to ignore it. Now, forgetting Y for the moment, on the right-hand side I actually have what I want. On the left-hand side I have something a little different. Ideally, what would I like to do? If this limit could jump inside the expectation, I would have exactly what I want: Fatou's lemma would be established. Is that allowed? So far we know only one result under which it is allowed: the monotone convergence theorem. So let Z_n = inf_{k >= n} (X_k - Y); this is indexed by n because I am taking the infimum over k >= n. Is Z_n increasing or decreasing? First of all, Z_n is non-negative: since X_k >= Y, each X_k - Y is non-negative, and the infimum of a non-negative sequence is non-negative, so Z_n >= 0 for all n. Second, if you increase n by 1, you are taking the infimum over a smaller set, which means the infimum can only get bigger; so Z_n is non-decreasing. That much is a very easy argument. So Z_n is a non-decreasing, non-negative sequence of random variables, and lim_{n -> infinity} Z_n is simply liminf_n (X_n - Y).
So lim_{n -> infinity} Z_n = liminf_n (X_n - Y), by the definition of liminf. I have a sequence of random variables Z_n that converges monotonically to that random variable; call it Z. So by the MCT I can take the limit inside the expectation: since Z_n is monotone and converges to Z, lim_n E[Z_n] = E[Z], and the MCT gives the required result, assuming Y = 0. If Y is not 0, the MCT gives E[liminf_n X_n - Y] <= liminf_n E[X_n - Y]. If Y were 0, no problem; if Y is a random variable with finite mean, I can invoke linearity of expectation: E[liminf_n X_n - Y] = E[liminf_n X_n] - E[Y], and similarly on the right-hand side the expectation splits, E[X_n - Y] = E[X_n] - E[Y]. This is where the fact that E[|Y|] is finite is used: you can split the expectation and cancel E[Y] from both sides. If that were not true, these expressions may not be well defined — you could have an infinity-minus-infinity form — and the argument does not work. So linearity finishes the proof of part 1, and the second part follows by applying the first part to the sequence -X_n and -Y. It is not a distinct statement; it is the same thing with X_n replaced by -X_n and Y by -Y, because liminf of -X_n is minus limsup of X_n. Any questions?
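To see that Fatou's inequality can be strict, here is a sketch on an assumed two-point sample space (my own illustration, not from the lecture): X_n oscillates between the indicator of one point and the indicator of the other, so Y = 0 serves as the lower bound.

```python
# Omega = {0, 1}, P({0}) = P({1}) = 1/2; X_n is the indicator of {n mod 2}.
def X(n, omega):
    return 1.0 if omega == n % 2 else 0.0

# E[X_n] = 1/2 for every n, so liminf_n E[X_n] = 1/2.
E_Xn = [0.5 * X(n, 0) + 0.5 * X(n, 1) for n in range(100)]
assert all(e == 0.5 for e in E_Xn)

# Pointwise, inf_{m >= n} X_m(omega) = 0 for both omega (every tail contains
# zeros), so liminf_n X_n = 0 and E[liminf X_n] = 0 < 1/2 = liminf E[X_n].
tail_inf = {omega: min(X(m, omega) for m in range(50, 100)) for omega in (0, 1)}
assert tail_inf == {0: 0.0, 1: 0.0}
```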
So Fatou's lemma is a corollary of the monotone convergence theorem. The only non-trivial step — everywhere else I was just applying the definition of liminf — was realizing that the MCT can be used to take the limit inside the expectation. One of the main uses of Fatou's lemma is in proving the dominated convergence theorem, though it may also be of independent use in some circumstances. The DCT follows very easily from Fatou's lemma; that is probably why Fatou's lemma is called a lemma — it is the lemma for the DCT. You often wonder about these lemmas in mathematics, what they are lemmas for; even the Borel–Cantelli lemma — I do not quite know why it is called a lemma, probably because it helps prove the strong law of large numbers. So, the dominated convergence theorem; we will use Fatou's lemma to prove it. Consider a sequence X_n of random variables — I will state it for random variables, you can state it for functions also — such that X_n converges to X almost surely, which means this convergence happens on a set of probability 1. Suppose there exists a random variable Y such that |X_n| <= Y almost surely, and Y has finite expectation. Then lim_{n -> infinity} E[X_n] = E[X]. So you have a sequence X_n converging to X almost surely, but this convergence need not be monotonic: you can have X_n bigger, X_{n+1} smaller, you can have oscillations, no problem.
But if that is all you had, it is not true that you can interchange limit and expectation: if X_n converges to X almost surely, it is not necessarily the case that the sequence of expectations converges to E[X]. Here, though, you have a situation where the X_n are dominated by another random variable Y: |X_n| <= Y, and Y has finite mean. Then you can interchange limit and expectation: the limit of the expectations equals the expectation of the limiting random variable. The dominated convergence theorem is often applied to bounded random variables — if the X_n are, for example, uniform random variables, or anything bounded almost surely, you can find such a bound easily. So: a sequence of random variables converging almost surely to some random variable, dominated in absolute value by a random variable Y with finite mean — then you can interchange limit and expectation. Let us prove this. Since -Y <= X_n <= Y for all n, we can invoke both sides of Fatou's lemma: the side I proved requires X_n to be bounded below by a random variable with finite mean, and the other side, the limsup side, requires X_n to be bounded above by one. Since I have assumed |X_n| <= Y, I necessarily have both. The proof is actually very simple. We have E[X] = E[lim X_n]. Why is this true? Because the almost sure limit of X_n is X, and since the limit exists, liminf and limsup agree: liminf X_n = lim X_n.
We know the limit exists because almost sure convergence holds. Now, as soon as you see E[liminf X_n], what do you do? Invoke Fatou: it says this is <= liminf E[X_n]. The liminf of the expectations always exists, so this is well defined. Next, liminf <= limsup always, so liminf E[X_n] <= limsup E[X_n]; I need this to bring in the other side of Fatou. Now invoke part 2 of Fatou's lemma: limsup E[X_n] <= E[limsup X_n]. And what is that equal to? E[limsup X_n] = E[X] again, because limsup X_n equals the almost sure limit of X_n, and since there is an expectation outside, what happens on a set of probability 0 does not matter. So I have only proved that E[X] <= E[X] — actually, that E[X] is less than or equal to itself, which I already knew. Why am I doing all this? Have I proved anything non-trivial? If you just look at the two ends, it looks like a tautology, a very trivial statement. But I have proved something non-trivial: all these inequalities must be equalities. There are three inequalities here; if any of them were strict, I would get the contradiction that E[X] is strictly less than E[X], which is not true. So all the inequalities must be equalities.
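The whole chain of the proof, written out in one line (a LaTeX sketch):

```latex
% Both sides of Fatou's lemma squeeze the expectations between two copies of E[X]:
\mathbb{E}[X]
  = \mathbb{E}\big[\liminf_n X_n\big]
  \;\le\; \liminf_n \mathbb{E}[X_n]
  \;\le\; \limsup_n \mathbb{E}[X_n]
  \;\le\; \mathbb{E}\big[\limsup_n X_n\big]
  = \mathbb{E}[X],
% so every inequality in the chain must in fact be an equality.
```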
So, in particular, which equality here is of consequence? The equality liminf E[X_n] = limsup E[X_n]: it means the limit lim_n E[X_n] exists. So all the inequalities above must be met with equality; thus liminf_n E[X_n] = limsup_n E[X_n] = E[X], and since liminf equals limsup, lim_n E[X_n] = E[X]. There really is something non-trivial here: it looks like nothing was proved, but it shows the two ends must be equal, which forces the limit to exist and to equal E[X]. The dominating random variable Y just lets you invoke Fatou's lemma from both directions — that is its only job. As long as you find some random variable Y that dominates |X_n|, and as long as it has finite mean — of course, otherwise Fatou itself will not hold — you can always interchange limit and expectation. And if you want to state the DCT for functions and integrals, how would you state it? Let f_n be a sequence of functions such that f_n converges to f mu-almost everywhere, and suppose there exists some other function g with |f_n| <= g and the integral of |g| with respect to mu finite; then lim_{n -> infinity} of the integral of f_n d mu equals the integral of f d mu. Any questions? A corollary of the dominated convergence theorem — or rather its most common application — is when Y is some constant M.
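The function version just stated can be checked numerically. A sketch (midpoint-rule approximation on an assumed grid, my own example): f_n(x) = x^n on [0, 1] converges to 0 for every x < 1 and is dominated by g = 1, whose integral is 1 < infinity, so the DCT predicts that the integrals of f_n tend to 0, the integral of the limit.

```python
# Midpoint-rule approximation of int_0^1 f_n(x) dx for f_n(x) = x**n.
N = 100_000
grid = [(i + 0.5) / N for i in range(N)]

def integral_fn(n):
    # Approximates int_0^1 x**n dx; the exact value is 1 / (n + 1).
    return sum(x ** n for x in grid) / N

ns = (1, 10, 100, 1000)
vals = [integral_fn(n) for n in ns]
assert all(abs(v - 1 / (n + 1)) < 1e-3 for v, n in zip(vals, ns))

# The integrals shrink towards 0 = integral of the (pointwise) limit function.
assert vals[-1] < 1e-2
```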
So if you have a sequence of random variables all dominated by a constant — all bounded random variables, for example — then you can always interchange limit and integration; some people call that corollary the bounded convergence theorem. So now you have at least two situations where you can interchange limit and integration: MCT and DCT. And if you think about it, now that you have seen the proof, the DCT is simply a corollary of the MCT; it is not really a distinct theorem, just stated separately because it is practically very useful. The MCT is all there is — there is only one major theorem on convergence of integrals, and that is the MCT. Any questions? No, it is different: Fatou's lemma needs exactly this kind of situation. Part 1 requires X_n >= Y, part 2 requires X_n <= Y; without such a bound you cannot invoke the corresponding side of Fatou's lemma. Any other questions? You cannot say that, because of the example we gave earlier, if you remember: the sample space Omega was (0, 1), and X_n(omega) was equal to n on (0, 1/n) and 0 otherwise. In that case E[X_n] = 1 for all n, but you cannot interchange expectation and limit. In this particular case E[X_n] is finite for every n, but the X_n themselves are not dominated: they are getting bigger and bigger, and there is no random variable Y dominating all the X_n with E[Y] finite. If there were, this would not be a counterexample — you would be able to interchange limit and expectation.
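The counterexample can be sketched directly (same construction as in the lecture; the probe point omega = 0.01 is just for illustration):

```python
# Omega = (0, 1) with uniform probability; X_n = n on (0, 1/n), else 0.
def X(n, omega):
    return float(n) if 0.0 < omega < 1.0 / n else 0.0

# Each X_n is a simple function: E[X_n] = n * P((0, 1/n)) = n * (1/n) = 1.
expectations = [n * (1.0 / n) for n in range(1, 50)]
assert all(abs(e - 1.0) < 1e-12 for e in expectations)

# But the pointwise limit is 0: for fixed omega, X_n(omega) = 0 once n > 1/omega.
omega = 0.01
assert X(50, omega) == 50.0   # n = 50: omega still lies inside the spike (0, 1/50)
assert X(101, omega) == 0.0   # n = 101 > 1/omega: the spike has shrunk past omega

# So lim E[X_n] = 1 != 0 = E[lim X_n]. No integrable Y dominates all X_n, since
# sup_n X_n(omega) grows like 1/omega, whose integral over (0, 1) diverges.
```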
So in that example neither monotonicity nor domination holds; if either of them held, you would have convergence of the integrals, convergence of the expectations. With that example I will stop.