Now, conditional probability mass functions. Earlier we were interested in joint probabilities, but now we want to condition on one random variable given the other. We already talked about conditioning on events, right? If E and F are two events, how did we define the conditional probability of F given that E has occurred? It is P(F | E) = P(E and F) / P(E). Now this notion can be extended to random variables. After all, the statement "X1 equals some number x1" is itself an event, so instead of asking about specific events I can ask the same questions in terms of the random variables. We have already defined joint probability mass functions; now we extend this to conditional probability mass functions. Suppose I want to know the probability that the second random variable takes the value x2 given that the first random variable has taken the value x1. The notation is p(x2 | x1), and the exact definition is p(x2 | x1) = p(x1, x2) / p(x1): the joint probability that X1 takes the value x1 and X2 takes the value x2, divided by the probability that X1 takes the value x1. Now focus on the notation, because we will write it like this every time: the part before the vertical bar is the variable we are asking about, the part after it is the variable we are conditioning on, and the vertical bar itself is just the symbol for "given"; it is not a division, ok.
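The definition above can be sketched in code. This is a minimal illustration, not from the lecture: the joint PMF is stored as a dictionary mapping (x1, x2) pairs to probabilities, and all the numbers and function names here are hypothetical.

```python
# Conditional PMF from a joint PMF: p(x2 | x1) = p(x1, x2) / p(x1).
# The joint PMF is a dict mapping (x1, x2) -> probability.

def marginal_x1(joint, x1):
    """P(X1 = x1): sum the joint PMF over all x2 values."""
    return sum(p for (a, _), p in joint.items() if a == x1)

def conditional_pmf(joint, x1):
    """p(x2 | X1 = x1) for every x2; requires P(X1 = x1) > 0."""
    denom = marginal_x1(joint, x1)
    if denom == 0:
        raise ValueError("cannot condition on a zero-probability value")
    return {b: p / denom for (a, b), p in joint.items() if a == x1}

# Hypothetical joint PMF for two dependent coins (values 0/1):
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
print(conditional_pmf(joint, 1))   # {0: 0.2, 1: 0.8}
```

Note the guard on the denominator: conditioning only makes sense on a value that has positive probability, a point the lecture returns to later.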
Now, earlier we argued that if X1 and X2 are discrete random variables with some joint probability mass function, and you add p(x1, x2) over all possible values of x1 and x2, the total is 1. But now suppose I am interested in a conditional probability: fix some x1, look at the probability that X2 takes the value x2 given that X1 = x1, and take the summation of this over all values of x2. What is this value going to be? Just to interpret what this is: there are two random variables, so for simplicity let X1 be one coin and X2 another coin; they may be dependent, I am not saying anything about dependence or independence right now. I tell you the first coin has already come up heads, say X1 = 1. Now I ask about X2: after the first coin has taken the value 1, the second coin can still come up heads or tails, and I am summing over those values. It has to add up to 1, right? After the first one has shown heads, the second one has to take either heads or tails; one of these possibilities must occur, there is no other way. So this sum is again going to be 1, and that is exactly why conditional probabilities are themselves probability mass functions. And once conditional probabilities are themselves probability mass functions, you can treat one as a simple probability mass function under the given condition, and for it you can define a CDF in the same fashion: with x1 fixed, F(x2 | x1) is the probability that X2 takes a value less than or equal to x2.
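This normalization argument can be checked numerically. The numbers below are hypothetical, just one fixed row of some joint PMF: dividing the row by its sum gives the conditional PMF, which adds to 1, and accumulating it gives the conditional CDF.

```python
# One row of a hypothetical joint PMF, i.e. p(x1, x2) for a fixed x1:
row = {0: 0.1, 1: 0.25}

p_x1 = sum(row.values())                          # marginal P(X1 = x1)
cond = {x2: p / p_x1 for x2, p in row.items()}    # conditional PMF p(x2 | x1)

total = sum(cond.values())                        # must be 1: it is a PMF
cdf_at_0 = sum(p for x2, p in cond.items() if x2 <= 0)   # conditional CDF F(0 | x1)
print(total, cdf_at_0)                            # ~1.0 and ~2/7
```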
So I am just adding the conditional probabilities over all values of X2 up to the value x2. And once I know the conditional probabilities, I can talk about conditional expectation. The conditional expectation of X2 given that X1 has taken the value x1 works like this: X1 = x1 has already happened, so I do not need to worry about X1 anymore; it is fixed. I now take each possible value x2, multiply it by the conditional probability p(x2 | x1), and sum over all possible values of x2. That gives the conditional expected value. Is this clear? Those who could not follow the expectation, go back and refer to the slides. Now, let us quickly do an example to see how much you have understood. This table is the same table we used earlier for the joint probability mass function; there we computed the marginals, but now we are going to compute conditional probabilities. So let us compute: given that X1 has taken the value 1, what is the probability that X2 takes the value 4? This one is very simple; go back and use the formula. The numerator is the joint probability that X2 takes the value 4 and X1 takes the value 1, which from the table is 0.05. Now, what is the probability that X1 equals 1? That is the sum of that row, which is 0.35, so the conditional probability is 0.05 / 0.35 = 1/7. Now I want to compute the expectation of X2 given that X1 equals 1. What do you need for that? You need the whole probability mass function of X2 given X1 = 1. One value we have already computed: the probability that X2 takes the value 4 given X1 = 1. Now you need to find the probabilities that it takes the values 2 and 5. Can somebody tell me the probability of 2 given 1? The denominator remains the same, right?
What is going to change is the numerator: it is 0.1, so this is 10/35. And similarly, the probability that X2 equals 5 given X1 = 1 has numerator 0.2, which gives 20/35. So now you have all the possible values of X2. The first thing to verify: given X1 = 1, what are the possible values X2 is taking? It is taking three values, 2, 4 and 5, and we have just computed its conditional probability mass function. When X2 = 2 it was 10/35; when X2 = 4 it was 1/7, or let me write that as 5/35; and when X2 = 5 it was 20/35. Is this a probability mass function? Yes, because it is adding up to 1, and a conditional probability mass function always has to add up to 1. Now finding the expectation is easy; you have everything you need. All you need to do is 2 × 10/35 + 4 × 5/35 + 5 × 20/35. Running the totals: 20, then 40, then 140, all over 35. Am I correct? That is it; 140/35 = 4 is the expected value. So as a quick recap, this is the expectation of X2 given that X1 = 1. Now can you compute the expectation of X2 by itself, the unconditional expectation? That is nothing but the sum over x2 of x2 × P(X2 = x2); you can compute this marginal from the joint probabilities, and once you have it you can compute the expectation. Can somebody quickly compute this and tell me whether you get the same value? We are getting 3.7, that is, 37/10.
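The worked example above can be replayed in a few lines. Only the X1 = 1 row of the table is used here, with the joint probabilities stated in the lecture: p(1, 2) = 0.1, p(1, 4) = 0.05, p(1, 5) = 0.2; the rest of the table is not needed for the conditional computation.

```python
# Joint probabilities p(X1 = 1, X2 = x2) from the table's X1 = 1 row:
row = {2: 0.10, 4: 0.05, 5: 0.20}

p_x1 = sum(row.values())                         # P(X1 = 1) = 0.35
cond = {x2: p / p_x1 for x2, p in row.items()}   # 10/35, 5/35, 20/35

assert abs(sum(cond.values()) - 1) < 1e-12       # verify it is a valid PMF
e_cond = sum(x2 * p for x2, p in cond.items())   # E[X2 | X1 = 1]
print(e_cond)                                    # ~4, i.e. 140/35
```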
Did anybody get 3.7? And no, the conditional expectation is not 2; it is 4. So the conditional expectation is 4 and the unconditional expectation is 3.7. In that way, conditional and unconditional expectations need not be the same, ok. Similarly, you can compute the conditional expectations for the other values of X1; I am just skipping those. Once we have done this for the PMF, the same thing can be done for probability density functions. Say you have a joint continuous probability density function and you want the PDF of X2 given that X1 has taken some value. By definition, the conditional PDF is f(x2 | x1) = f(x1, x2) / f(x1): again the joint PDF divided by the marginal PDF of the variable on which you want to condition. And similarly, if you want to compute the conditional expectation, you do the same thing: take the expectation with respect to the conditional PDF. As an example, I have put up the same joint PDF we used earlier, where we computed the marginal PDF from the joint PDF. By the way, does somebody remember the value of the constant C from earlier? It was 4/19, so let us take C = 4/19. Now, what is the value of the conditional PDF of X2 given that X1 = 0.25? To compute this, you need f(0.25, x2) divided by the marginal f(0.25). And notice that in every one of these cases, the value on which you want to condition must carry positive probability, otherwise you end up dividing by 0. If you are trying to condition on some value x1, it had better be a value with some positive mass; if the mass there is 0, that value is never going to occur, so why would you want to condition on something that cannot happen?
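The continuous case can be sketched the same way. The joint density below is made up purely for illustration (the lecture's actual density, with C = 4/19, is on the slides): the marginal is approximated by numerical integration over x2, and the conditional density is the joint over the marginal, with the same positivity guard as in the discrete case.

```python
# Conditional PDF f(x2 | x1) = f(x1, x2) / f_X1(x1), numerically.

def f_joint(x1, x2):
    """Hypothetical joint PDF on the unit square: f(x1, x2) = x1 + x2."""
    return x1 + x2 if 0 <= x1 <= 1 and 0 <= x2 <= 1 else 0.0

def marginal_pdf_x1(x1, n=10000):
    """f_X1(x1): integrate the joint over x2 in [0, 1] by the midpoint rule."""
    h = 1.0 / n
    return sum(f_joint(x1, (k + 0.5) * h) for k in range(n)) * h

def conditional_pdf(x2, x1):
    """f(x2 | X1 = x1); only defined where the marginal density is positive."""
    denom = marginal_pdf_x1(x1)
    if denom <= 0:
        raise ValueError("condition only where the marginal PDF is positive")
    return f_joint(x1, x2) / denom

print(conditional_pdf(0.5, 0.25))   # exact: (0.25 + 0.5) / (0.25 + 0.5) = 1
```

For this linear density the marginal is x1 + 1/2 exactly, so at x1 = 0.25 the denominator is 0.75 and the conditional density at x2 = 0.5 comes out to 1.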
So, whenever you want to condition, make sure in the discrete case that you condition on a value with nonzero mass, and in the continuous case that the PDF is positive at the point on which you condition. Similarly, you can compute the conditional expectation here as well: first find the conditional PDF, and based on that you can find the conditional expectation, ok.