Now I want to talk about an important law in probability called the total probability law. To understand it, let us take two events: say E is one set and F is another. We know that Ω can be written as F ∪ F^c; F together with F^c covers the entire sample space. Now look at what portion of E lies in F, which is E ∩ F, and what portion of E lies in F^c, which is E ∩ F^c; together these make up exactly E. Do you all agree? And notice that these two portions are mutually exclusive. So what I have basically done is split E into the part that overlaps with F and the part that falls outside F, in F^c, and these two parts are disjoint. If that is the case, then to compute the probability of E I can apply the third axiom of probability, which says that if two events are mutually exclusive, the probability of their union is the sum of their probabilities. So P(E) = P(E ∩ F) + P(E ∩ F^c). Now I apply the definition of conditional probability to each term: P(E ∩ F) = P(E|F) P(F) and P(E ∩ F^c) = P(E|F^c) P(F^c), which gives P(E) = P(E|F) P(F) + P(E|F^c) P(F^c). I have done this for two events, conditioning on F and F^c, but I can generalize it. Suppose I have F_1, F_2, ..., F_n which are mutually exclusive, and, one condition I also missed, they form a partition: the union of the F_i for i = 1 to n equals Ω.
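The two-event identity above can be checked by direct counting on a small finite sample space. Here is a minimal Python sketch; the particular sample space and the choices of E and F are my own illustrative examples, not from the lecture:

```python
from fractions import Fraction

# Illustrative sample space: one roll of a fair die.
omega = {1, 2, 3, 4, 5, 6}
E = {2, 3, 4}          # event of interest
F = {1, 2, 3}          # conditioning event
Fc = omega - F         # its complement F^c

def prob(A):
    """P(A) under the uniform measure on omega."""
    return Fraction(len(A), len(omega))

def cond(A, B):
    """Conditional probability P(A | B) = P(A ∩ B) / P(B)."""
    return prob(A & B) / prob(B)

# Total probability law for two events:
# P(E) = P(E|F) P(F) + P(E|F^c) P(F^c)
lhs = prob(E)
rhs = cond(E, F) * prob(F) + cond(E, Fc) * prob(Fc)
print(lhs, rhs)   # 1/2 1/2
```

Both sides come out equal, as the third axiom plus the definition of conditional probability guarantees.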
Once I have this partition, then for any event E I can write P(E) = Σ_{i=1}^{n} P(E|F_i) P(F_i), and this is called the total probability law. What the total probability law is telling you is that you can compute the unconditional probability of E from the conditional probabilities of E given the F_i's that form the partition, after multiplying each by the probability of its partition set. Pictorially, that is what it is saying: what I showed for a single F, I have now drawn as regions F_1, F_2, F_3, F_4, ..., F_i, and there is some event E in the gray shaded area. If I want to compute the probability of E, then E could happen because of the portion falling inside F_1, or because of the portion falling in F_4, or maybe the portion falling in F_2. So I am just taking all of them and adding them up. If you recall, I said that the event E happens when one of the possibilities in E happens, and whichever possibility that is, it falls in exactly one of the partition sets. Now, using this, we get a fundamental formula in probability called the Bayes formula. Suppose the event E has happened, and I am now interested in whether E happened because of something in the F_4 region, or the F_1 region, or the F_3 region; I can ask that question, and that is what we will now see. If the event E has happened, what is the probability that it came from the F_j part? Here we can again apply our conditional probability formula: P(F_j|E) = P(E ∩ F_j) / P(E). In the numerator I again apply conditional probability, now conditioning on F_j, to get P(E|F_j) P(F_j), and the denominator P(E) comes from the total probability law. So P(F_j|E) = P(E|F_j) P(F_j) / Σ_{i=1}^{n} P(E|F_i) P(F_i).
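The general total probability law and the Bayes formula can both be written in a few lines. A minimal sketch, assuming made-up values for P(F_i) and P(E|F_i) over a three-set partition (none of these numbers are from the lecture):

```python
from fractions import Fraction

# Hypothetical partition probabilities P(F_i); they must sum to 1.
p_F   = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]
# Hypothetical conditional probabilities P(E | F_i).
p_E_F = [Fraction(1, 10), Fraction(1, 2), Fraction(9, 10)]

# Total probability law: P(E) = sum_i P(E|F_i) P(F_i)
p_E = sum(pe * pf for pe, pf in zip(p_E_F, p_F))

# Bayes formula: P(F_j|E) = P(E|F_j) P(F_j) / P(E)
posterior = [pe * pf / p_E for pe, pf in zip(p_E_F, p_F)]

print(p_E)   # 19/50
```

Note that the posteriors P(F_j|E) automatically sum to 1, since the denominator is exactly the sum of the numerators.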
Notice that on the right-hand side everything is conditional on F_j, E is conditioned on F_j, while on the left-hand side F_j is conditioned on E. Now, a quick aside: suppose E and F are disjoint events, and you want to compute P(E|F). Just apply the definition: P(E|F) = P(E ∩ F) / P(F). What is E ∩ F in this case? The null set. So P(E|F) is 0, because if I tell you that F has happened, can E happen? No, because E does not overlap with F at all. So if I know that F has happened, there is no way E can happen, and its probability is 0: if two events are disjoint and you condition on one, the other gets probability 0. And now here is one simple example; I will just state it and leave it to you for verification. This example came up when we were doing everything online, when we were all fighting the corona thing. Suppose there are four possible symptoms: mild fever, body ache, high fever, and cold and cough. Each has some probability of occurring, and we were seeing all these symptoms during the peak of the pandemic; we wanted to see, if somebody is really positive, which symptom it could be due to, or what kind of symptoms one would have. Suppose each of these has some probability of occurring: say P(F_1) = 0.2, P(F_2) = 0.1, and so on. Now you want to ask: if somebody has a mild fever, what is the probability that he has the infection? To set this up, suppose we know that if somebody has a mild fever, then the probability that he is infected is 0.5.
If somebody has a body ache, then the probability that he is infected is 0.2, and so on. Now we can ask the reverse question: somebody has already tested positive; what is the probability that he tested positive because of a particular one of the symptoms? You can see that you can go and apply the Bayes formula there, and the Bayes formula has many, many applications. Usually we call the probabilities of the F_i's prior information: we often know something beforehand, and then the actual event happens; here E is the actual event. We know the probability of E happening given that something like F_5 has happened; that is our prior information. Now that the event E itself has happened, we want to compute the probability that it came from F_5, and this is sometimes called a posterior. So you are trying to find the posterior probability using your prior information. Fine, now let me see if any of you have questions; if not, let me move to the next topic. So far we went through the following things: sample spaces, events, axioms of probability, conditional probability, independence of events, and the Bayes formula. Now we will talk about random variables, discrete and continuous random variables, and some of the functions associated with them, called CDFs and PDFs. Where does the random variable come into the picture? In most experiments we are interested in some function of the outcomes and not the outcome itself. Why? Because when you toss a coin, what you see is heads and tails, but you may not be interested in heads and tails themselves; you may be interested in the number of times heads comes, or the number of times tails comes, or the number of tosses you have to make until heads comes. You are interested in such questions, which are about numbers.
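Plugging the symptom example into the Bayes formula might look like the sketch below. Be careful: only P(F_1) = 0.2, P(F_2) = 0.1, P(E|F_1) = 0.5, and P(E|F_2) = 0.2 are from the lecture; the probabilities attached to high fever and cold and cough are made-up fillers chosen only so that the F_i's form a complete partition:

```python
from fractions import Fraction

symptoms = ["mild fever", "body ache", "high fever", "cold and cough"]
# P(F_i): the first two values (0.2, 0.1) are from the lecture;
# the last two are hypothetical so the probabilities sum to 1.
p_F   = [Fraction(2, 10), Fraction(1, 10), Fraction(3, 10), Fraction(4, 10)]
# P(E | F_i), probability of infection given each symptom;
# again only the first two values (0.5, 0.2) are from the lecture.
p_E_F = [Fraction(5, 10), Fraction(2, 10), Fraction(8, 10), Fraction(1, 10)]

# Total probability law: P(E), the overall probability of testing positive.
p_E = sum(pe * pf for pe, pf in zip(p_E_F, p_F))

# Bayes formula: posterior P(F_j | E) for each symptom.
posterior = {s: pe * pf / p_E for s, pe, pf in zip(symptoms, p_E_F, p_F)}
print(posterior["high fever"])   # 3/5
```

With these (partly invented) numbers, a positive test is most likely explained by high fever; the priors P(F_i) get reweighted by how strongly each symptom predicts infection.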
That is where the random variable comes into the picture: we want to map the outcomes to numbers. Instead of outcomes like heads, tails, and so on, I want to get rid of those and convert everything to numbers, so that I can do mathematical operations; I can crunch numbers, but not heads, tails, and other characters. So let us look at some examples. In tossing two coins we may be interested in the number of heads that appeared. If I am interested in at least one head, then both HT and TH are fine with me: in one of them heads came in the first toss and tails in the second, in the other it is reversed, but in both cases one head has happened. That is what I mean: the actual outcome is not of consequence to me; what is of interest is whether one head has happened. That number one is what is important to me. Now look at the rolling of two dice, where you may be interested in the sum being 6. The sum being 6 could happen because the first die showed 1 and the second showed 5, or the reverse, or it could be 2 and 4, again possibly reversed, or 3 and 3; any of these gives the sum 6. I am not interested in the specific outcome; my quantity of interest is whether the sum is 6, and for that purpose all of these outcomes are equivalent. Another example could be grading in a class, where the policy is that we do not report the exact marks you scored; we put you in grade brackets like AA, AB, BB, and so on. Anyone who got above 90 could be AA, and anybody in, say, the 75 to 90 bracket could be AB. In that case what matters to us is ultimately the grade points 10, 9, 8, and so on, not the exact number between 0 and 100 you got.
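The "number of heads" map for two coin tosses can be written out explicitly. A minimal Python sketch; representing outcomes as strings like "HT" is my own choice of encoding:

```python
from itertools import product

# All outcomes of tossing a coin twice, as strings like "HT".
outcomes = ["".join(w) for w in product("HT", repeat=2)]

# The random variable X maps each outcome to its number of heads.
X = {w: w.count("H") for w in outcomes}

# "At least one head" does not care about the order of H and T:
at_least_one = [w for w in outcomes if X[w] >= 1]
print(at_least_one)   # ['HH', 'HT', 'TH']
```

HT and TH are different outcomes but map to the same value X = 1, which is exactly the point: the random variable forgets the details we do not care about.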
This is basically what a random variable is: a function mapping the outcomes. In the class example, the outcome is the marks you score, but we map it to a value on a 0 to 10 scale instead of 0 to 100. That is how we get to random variables. "Random variable" is actually something of a misnomer: this "variable" is really a function, a random function if you like. A random variable X is a function from your sample space to the real numbers; it assigns a real number to every possible outcome. This simple definition holds when things are discrete, but when things are continuous one has to be a little more formal and worry about what the inverse maps are, and so on; we will not get into that. Perhaps the formal definition of a random variable, with the inverse maps and all, has been discussed in IE 621. For our purposes, for the time being, just take it that a random variable is a map from the sample space to the real numbers. Now let us go back to our example and see what random variable we can define. Take a coin, toss it repeatedly, and stop when the first head comes. Think of it as attempting something and trying until you succeed, and when you succeed, you stop.
Now, until you stop, what are all the things that can happen, what are the possible outcomes of this experiment? You may succeed in one attempt, in which case heads came first; you may succeed in the second attempt, in which case the first toss was a failure and the second a success; or you may succeed in the third attempt after two failures; and so on. So my sample space is actually {H, TH, TTH, TTTH, TTTTH, ...}, and it can go on, but I am not interested in the H, T sequence itself; what matters to me is how many attempts I took to succeed: one attempt, two attempts, three attempts. So here my random variable X does the following: for the outcome H it assigns the number 1, for TH it assigns 2, for TTH it assigns 3, and so on. Any question on random variables so far? Let me ask this question. Take a pair of dice, where the sample space is (1,1), (1,2), up to (1,6), all the way from (6,1) up to (6,6). I am going to define my random variable X on this sample space as the sum of the outcomes; I am not looking at what each face shows, I am interested in their sum. So what are the possible values X can take? It can take any value from 2 to 12. Now if I ask what is the probability that X equals 3, how can you compute it? The sum 3 can happen only because of (1,2) or (2,1); these are the only outcomes that would give me X = 3. Based on this I can go and compute P(X = 3) = 2/36 from these possible outcomes.
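The dice computation above can be sketched by brute-force counting over the 36 equally likely outcomes; a minimal Python version:

```python
from fractions import Fraction
from itertools import product

# Sample space of a pair of fair dice: 36 equally likely outcomes.
omega = list(product(range(1, 7), repeat=2))

def X(w):
    """The random variable: sum of the two faces."""
    return w[0] + w[1]

def p(x):
    """P(X = x) by counting the outcomes that map to x."""
    favourable = [w for w in omega if X(w) == x]
    return Fraction(len(favourable), len(omega))

print(p(3))   # 1/18, from the two outcomes (1,2) and (2,1)
```

As a sanity check, summing p(x) over all values x = 2, ..., 12 gives exactly 1, since the events {X = x} partition the sample space.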