Let me fill in a few gaps from the last lecture. I had discussed the card game bridge in Example 2.7: there are 4 players A, B, C, D, and 26 cards have been dealt to players A and C, including 7 clubs. We wanted to compute the probability that player B gets 4 clubs. I tried to present this as a conditional probability computation, that is, given that 7 clubs have already been dealt to A and C, what is the probability that B gets 4 clubs? We solved the problem and I told you the answer, but it was not convenient to explain it through the conditional probability argument. So I want to show you that there are other ways of computing this probability, namely by reducing the sample space. Instead of considering the distribution of all 52 cards among the 4 players, I reduce the sample space to the 26 remaining cards, of which 6 are clubs, since 7 clubs have already gone to A and C. Then it becomes a simple counting (hypergeometric) argument: B should get 4 of the 6 clubs, and the other 9 of his 13 cards should come from the remaining 20 non-club cards. So the number of favourable ways is C(6,4) × C(20,9), and the total number of ways B can get 13 cards out of 26 is C(26,13). The required probability is therefore C(6,4) C(20,9) / C(26,13), and we obtained it by reducing the sample space. This is another technique which is very helpful at times; you can compute the required probability very easily this way.
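The reduced-sample-space count above can be checked numerically. A minimal sketch using Python's standard-library math.comb, with the numbers exactly as in the example:

```python
from math import comb

# Reduced sample space: 26 cards remain for B and D, 6 of them clubs.
# Favourable: B gets 4 of the 6 clubs and 9 of the 20 non-clubs.
favourable = comb(6, 4) * comb(20, 9)
# Total: any 13 of the 26 remaining cards can go to B.
total = comb(26, 13)

p = favourable / total
print(p)  # roughly 0.242
```

The same number comes out of the full 52-card computation, which is what makes the reduced sample space a legitimate shortcut.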
So, you do not always have to go through the formula; you can play around with alternative ideas as well. Now, remember I asked you to construct examples when we were talking of independence of 3 events. I showed you that 4 conditions have to be satisfied before you can say that 3 events E, F, G are independent, and I asked you to construct a counterexample in which the 3 pairwise conditions are satisfied but the fourth is not. Pairwise independence means that if you choose any 2 of the events, they satisfy the condition of independence: P(E ∩ F) = P(E)P(F), P(E ∩ G) = P(E)P(G), and P(F ∩ G) = P(F)P(G). So we want a situation where these 3 conditions hold but P(E ∩ F ∩ G) = P(E)P(F)P(G) fails, and I have been able to lay my hands on such an example. Suppose the sample space is Ω = {1, 2, 3, 4}, with each outcome equally likely; since there are 4 elements, each outcome has probability 1/4. Now choose E = {1, 4}, F = {2, 4}, and G = {3, 4}. Then P(E) = P(F) = P(G) = 2/4 = 1/2, because each set has 2 of the 4 equally likely outcomes — again we are using the m/n version of probability. Next, E and F have only the outcome 4 in common, so E ∩ F is a singleton and P(E ∩ F) = 1/4, which is exactly the product P(E)P(F).
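The counterexample can be verified mechanically by enumerating the equally likely outcomes. A short sketch with exact rational arithmetic (the helper name `prob` is mine):

```python
from fractions import Fraction

omega = {1, 2, 3, 4}
E, F, G = {1, 4}, {2, 4}, {3, 4}

def prob(event):
    # Equally likely outcomes: the m/n version of probability.
    return Fraction(len(event & omega), len(omega))

# Pairwise independence holds for all three pairs...
assert prob(E & F) == prob(E) * prob(F)
assert prob(E & G) == prob(E) * prob(G)
assert prob(F & G) == prob(F) * prob(G)
# ...but mutual independence fails: P(E n F n G) = 1/4, not 1/8.
assert prob(E & F & G) == Fraction(1, 4)
assert prob(E & F & G) != prob(E) * prob(F) * prob(G)
print("pairwise independent but not mutually independent")
```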
Similarly, you can easily verify that all 3 events are pairwise independent. But E ∩ F ∩ G again results in the single outcome {4}, so P(E ∩ F ∩ G) = 1/4, which is not equal to P(E)P(F)P(G) = 1/8. So now you get the idea, and you should try to construct an example of your own to show that pairwise independence of 3 events is not enough to say that the 3 events are independent. Extending this to more events becomes a little more cumbersome, so we will leave it here. Now, I have given you the corrected version of Exercise 2, and I want to revisit Question 10 because when I read it last time I said it needed a little thought. You were given data about a battery's lifetime: the probability that the battery lasts for at least 10,000 kilometres is 0.8, the probability that it lasts for at least 20,000 kilometres is 0.4 (a probability is also associated with lasting at least 30,000 kilometres). You were asked: if you have bought a battery which has already run 10,000 kilometres, what is the probability that its lifetime will exceed 20,000 kilometres? This is a simple case of conditional probability.
So, if L denotes the lifetime of the battery, we want the conditional probability of the event that the lifetime exceeds 20,000 kilometres given that the battery has already run 10,000 kilometres, that is, P(L ≥ 20,000 | L ≥ 10,000). Now, {L ≥ 20,000} is a subset of {L ≥ 10,000}, so the intersection of the two events is simply {L ≥ 20,000}: if you want the battery to run 10,000 kilometres and also to run 20,000 kilometres, the intersection means it runs at least 20,000 kilometres. Dividing by the probability of the conditioning event {L ≥ 10,000}, we get P(L ≥ 20,000 | L ≥ 10,000) = 0.4 / 0.8 = 1/2. Maybe some of you have already done it, but this is the right answer. We have talked of conditional probability, but I want to formalize it some more: given an event F with positive probability, the conditional probability function P(· | F) satisfies the three axioms of probability, and this is not too difficult to show. The first axiom requires that for any event E, the conditional probability P(E | F) lies between 0 and 1. By definition, P(E | F) = P(E ∩ F)/P(F). Now, you see that E ∩ F is a subset of F.
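The battery computation above is a one-line ratio; a tiny sketch with the problem's numbers:

```python
# Survival probabilities given in the problem statement.
p_ge_10000 = 0.8   # P(L >= 10,000 km)
p_ge_20000 = 0.4   # P(L >= 20,000 km)

# {L >= 20000} is a subset of {L >= 10000}, so the intersection is
# just {L >= 20000} and the conditional probability is a ratio.
p_cond = p_ge_20000 / p_ge_10000
print(p_cond)  # 0.5
```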
Since E ∩ F is a subset of F — and remember, after giving the axioms we proved some propositions, and there I showed that if one event is a subset of another then its probability is no larger — we have P(E ∩ F) ≤ P(F). And by definition both P(E ∩ F) and P(F) are non-negative, so P(E | F) is non-negative; dividing P(E ∩ F) ≤ P(F) by P(F) shows P(E | F) ≤ 1. So the first axiom is satisfied. The second axiom says P(Ω) should be 1. Here, since we are conditioning on F, the corresponding statement is P(Ω | F) = 1. Indeed, Ω ∩ F = F, so P(Ω | F) = P(F)/P(F) = 1, and the second axiom is satisfied. The third axiom concerns a sequence of disjoint events E_i, i ∈ I. We have P(∪_i E_i | F) = P((∪_i E_i) ∩ F)/P(F), and because the distributive law holds, (∪_i E_i) ∩ F = ∪_i (E_i ∩ F). Now, the events E_i ∩ F are disjoint, since the E_i are disjoint, and the original probability function P satisfies axiom 3; therefore this can be written as Σ_i P(E_i ∩ F) / P(F).
And that in turn can be written as Σ_i P(E_i | F), so the third axiom is also satisfied. Therefore the conditional probability function is itself a probability function — it satisfies the 3 axioms. Now, since conditional probability is an important concept, I want to continue with a slightly more elaborate example involving 2 gamblers: the gambler's ruin problem, which is an interesting use of conditional probability. There are 2 gamblers, A and B, and they bet on the outcome of the toss of a coin. On each toss, if a head (H) occurs then A collects 1 unit from B, and if a tail (T) occurs then B collects 1 unit from A. It is a simple game: you toss a coin; if the head shows, A gets the money, and if the tail shows, B gets the money from A. The tossing continues until one of them runs out of money — that is the gambler's ruin: one of them ends up with all the money and the other has none left. Suppose the total money is 15 units — I am just picking a number; you will see from the solution that the amount they start with does not matter — and A has i units, so B has 15 − i units. The tosses of the coin are independent events, and we allow the coin to be biased: P(H) = p and P(T) = 1 − p; we will also consider the case p = 1/2. We want to find the probability that A ends up with all the money, given that A started with i units. So let E be the event that A ends up with all the money, and let P_i denote its probability when the game starts with A holding i units.
So P_i is the probability that A ends up with all the money, having started with i units. Now, we have used this technique very often: the first toss of the coin results in either H or T, so we can condition on it and write P(E) = P(E | H) P(H) + P(E | T) P(T), where P(E | H) is the probability that A ends up with all the money, starting with i units, when the first toss results in a head, and P(E | T) has the similar interpretation for a tail. When the first toss gives a head, A collects one unit from B, so after the first toss A has i + 1 units; if the first toss shows a tail, A gives one unit to B and is left with i − 1 units. Therefore, with the notation introduced above, the equation can be rewritten as P_i = P_{i+1} p + P_{i−1} (1 − p): if a head comes, A now has i + 1 units and we want the probability that he ends up with all the money from there, and similarly with i − 1 units after a tail. Writing q = 1 − p, and since p + q = 1, I can multiply P_i by (p + q), getting p P_i + q P_i = p P_{i+1} + q P_{i−1}; then I rearrange the terms, bringing the p-terms to one side and the q-terms to the other.
So I get the equation P_{i+1} − P_i = (q/p)(P_i − P_{i−1}), a recursive relationship among the P_i's. The boundary conditions: if A has no money he cannot play the game, so the probability of his ending up with all the money is 0, i.e. P_0 = 0; and the moment A has all 15 units he has won the game, so P_15 = 1. Now we use this recursion starting with i = 1: P_2 − P_1 = (q/p)(P_1 − P_0) = (q/p) P_1, since P_0 = 0. Putting i = 2, P_3 − P_2 = (q/p)(P_2 − P_1), but P_2 − P_1 = (q/p) P_1 from before, so P_3 − P_2 = (q/p)^2 P_1. Continuing this way, P_15 − P_14 = (q/p)^14 P_1 — whatever the larger index, the power of q/p is one less. Now add up all these equations: the intermediate terms cancel in pairs and you are left with P_15 − P_1 on the left; since P_15 = 1, this gives 1 − P_1 = (q/p)[1 + (q/p) + (q/p)^2 + ... + (q/p)^13] P_1. The bracket is a finite geometric series with sum [1 − (q/p)^14] / [1 − (q/p)]. Since the progression is finite, the only thing I need is q/p ≠ 1, because otherwise I would be dividing by 0.
So, if q/p = 1, substitute q/p = 1 everywhere in the last set of equations: then 1 − P_1 = 14 P_1, that is, P_15 = 15 P_1 (the P_1 gets added over), and therefore P_1 = 1/15; going forward through the recursion you can easily compute P_2, P_3, and in general P_i = i/15. Now consider the case q/p ≠ 1. What we got on the last slide is 1 − P_1 = (q/p) [1 − (q/p)^14] / [1 − (q/p)] · P_1, using the geometric series sum. Taking P_1 to the other side and simplifying, you get [1 − (q/p)^15] / [1 − (q/p)] · P_1 = 1, so P_1 = [1 − (q/p)] / [1 − (q/p)^15]. And from the recursion equations you immediately get the required probability when A starts with i units of money: the telescoping sum up to P_i involves powers up to (q/p)^{i−1}, and when you substitute for P_1 the factors 1 − (q/p) cancel out, leaving P_i = [1 − (q/p)^i] / [1 − (q/p)^15]. Now, if you want the probability that B ends up with all the money, replace p by q and i by 15 − i, because for B the success probability is q and B starts with 15 − i units; with these two replacements you again get the formula, this time for B. Now, continuing with some more results on conditional probability: Proposition 2.5 says that if the conditional probability P(A | B) = 1, then P(B^c | A^c) = 1.
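The closed form for P_i, including the fair-coin limit, can be put into a small function; the function name and the numerical cross-checks below are my own:

```python
def ruin_prob(i, total, p):
    """Probability that gambler A, starting with i of `total` units,
    ends up with all the money when P(head) = p on each toss."""
    q = 1.0 - p
    if abs(p - q) < 1e-12:          # fair coin: P_i = i / total
        return i / total
    r = q / p
    return (1.0 - r ** i) / (1.0 - r ** total)

# Fair coin, 15 units total, A starts with 5: P_5 = 5/15.
print(ruin_prob(5, 15, 0.5))
# Coin biased towards A (p = 0.6): A's chances improve markedly.
print(ruin_prob(5, 15, 0.6))
```

As a consistency check, A's probability with (i, p) and B's probability with (total − i, q) should sum to 1, which is exactly the replacement rule stated above.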
I could have given this as an exercise, but I thought I would show you some more ways of working with conditional probability, and then you can try related statements yourself — for example, computing P(A^c | B), or seeing what happens under independence. So just look at the definition: P(A | B) = P(A ∩ B)/P(B). Since this equals 1, it follows that P(A ∩ B) = P(B). The moment you have this result, look at P(A ∪ B) = P(A) + P(B) − P(A ∩ B): because P(A ∩ B) − P(B) = 0, this reduces to P(A). Therefore, taking complements, P((A ∪ B)^c) = 1 − P(A), which is nothing but P(A^c). And by De Morgan's law we had seen that (A ∪ B)^c = A^c ∩ B^c, so P(A^c ∩ B^c) = P(A^c). Hence P(B^c | A^c) = P(A^c ∩ B^c)/P(A^c) = P(A^c)/P(A^c) = 1. One can go on like this, and the idea is that you should get interested enough to try out many more results related to conditional probability and solve more examples. Now, a final result on conditional probability: conditional independence. Remember, we defined two events to be independent if the probability of their intersection equals the product of their individual probabilities. The same idea extends: we say that E_1 and E_2 are conditionally independent, given that the event F has occurred, if P(E_1 | E_2 ∩ F) = P(E_1 | F).
That means — the same idea as in the definition of independence of two events — the occurrence of E_2 has no bearing on this probability: conditioning on E_2 ∩ F is the same as conditioning on F alone, and E_2 has no role to play. In fact, from this definition we can derive an equivalent product form. By definition, P(E_1 | E_2 ∩ F) = P(E_1 ∩ E_2 ∩ F) / P(E_2 ∩ F), and by conditional independence this equals P(E_1 | F). Now, P(E_2 ∩ F) can be written as P(E_2 | F) P(F), and P(E_1 ∩ E_2 ∩ F) can be written as P(E_1 ∩ E_2 | F) P(F). Remember that whenever we condition on an event, its probability must be positive, otherwise the conditional probability cannot be defined, so of course P(F) > 0 is understood; cancelling P(F), we are left with P(E_1 ∩ E_2 | F) = P(E_1 | F) P(E_2 | F) — the product of the individual conditional probabilities. So we have just extended the original definition of independence of two events in the natural way. After defining conditional independence of two events with respect to the occurrence of another event, I will now take up a slightly elaborate example; it may look a little complicated, but it shows good use of the concept of conditional independence.
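As a concrete check of the product form, here is a toy enumeration of my own construction: pick one of two coins at random and toss it twice. Given the coin chosen (the event F), the two tosses are conditionally independent, even though the events "first toss is a head" and "second toss is a head" are not unconditionally independent.

```python
from itertools import product

# Toy model (my own numbers): coin 'a' has P(head)=0.9, coin 'b'
# has P(head)=0.1, each chosen with probability 1/2, then two
# tosses that are independent given the coin.
coins = {"a": 0.9, "b": 0.1}

def p_atom(coin, t1, t2):
    ph = coins[coin]
    p1 = ph if t1 == "H" else 1 - ph
    p2 = ph if t2 == "H" else 1 - ph
    return 0.5 * p1 * p2

def prob(pred):
    # Sum the atom probabilities over all (coin, toss1, toss2)
    # triples satisfying the predicate.
    return sum(p_atom(c, t1, t2)
               for c, t1, t2 in product(coins, "HT", "HT")
               if pred(c, t1, t2))

F  = lambda c, t1, t2: c == "a"     # event: chose coin a
E1 = lambda c, t1, t2: t1 == "H"    # event: first toss is a head
E2 = lambda c, t1, t2: t2 == "H"    # event: second toss is a head

pF = prob(F)
lhs = prob(lambda *x: E1(*x) and E2(*x) and F(*x)) / pF
rhs = (prob(lambda *x: E1(*x) and F(*x)) / pF) * \
      (prob(lambda *x: E2(*x) and F(*x)) / pF)
print(lhs, rhs)  # equal: E1, E2 conditionally independent given F
```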
This example I have taken from Sheldon Ross — this book (I will give you the reference later on) has a lot of new and innovative examples, and I am using quite a few of them in this course. Consider the situation where there are k + 1 coins in a box, numbered 0, 1, ..., k, and if you pick up the i-th coin and toss it, the probability of it showing a head is i/k, for i = 0, 1, ..., k. So if you pick up the 0-th coin, the probability of a head is 0, which means both sides must be tails. If you happen to choose coin 2, the probability of a head is 2/k; and if you choose the (k+1)-th coin, which carries the number k, the probability of a head is k/k = 1, so this particular coin probably has heads on both sides. Whatever it is, the situation is this: a coin is randomly selected from the box and is repeatedly tossed (there was a typo on the slide here, which I am correcting — it should read "repeatedly"). If the first n tosses all result in heads, what is the conditional probability that the (n+1)-th toss will also result in a head? That is, I pick a coin at random from the box, toss it repeatedly, and given that the first n tosses have shown heads, I want to compute the probability that the (n+1)-th toss also shows a head. So let us start finding out how to compute this probability. Let C_i be the event that the i-th coin is initially selected; this could be any of the coins numbered 0, 1, 2, ..., k.
Then let F_n be the event that the first n tosses resulted in heads, and let H be the event we want: the (n+1)-th toss results in a head. So I want to compute the conditional probability P(H | F_n), and I am going to derive the expression for it. Since one of the k + 1 coins will certainly be selected, the union of the C_i over i = 0 to k has probability 1, and F_n can be written as F_n = ∪_{i=0}^{k} (F_n ∩ C_i). Therefore P(H | F_n) = P(H ∩ F_n)/P(F_n), and by the distributive property of intersection over union, H ∩ F_n = ∪_{i=0}^{k} (H ∩ F_n ∩ C_i). All the C_i are mutually exclusive, because exactly one of the coins gets selected, so these events are mutually exclusive and the probability of the union can be written as the sum: P(H | F_n) = Σ_{i=0}^{k} P(H ∩ F_n ∩ C_i) / P(F_n). Each term I can write in terms of conditional probabilities: P(H ∩ F_n ∩ C_i) = P(H | F_n ∩ C_i) P(F_n ∩ C_i), and P(F_n ∩ C_i) = P(C_i | F_n) P(F_n). Then P(F_n) cancels — F_n is a given event, so P(F_n) cannot be 0 — and I get the expression P(H | F_n) = Σ_{i=0}^{k} P(H | F_n ∩ C_i) P(C_i | F_n). Now, it is reasonable to assume that repeated tosses of the i-th coin are conditionally independent.
This is where I am using the concept: we assume that, given C_i, the repeated tosses of the i-th coin are conditionally independent, so the (n+1)-th toss is conditionally independent of F_n. That means P(H | F_n ∩ C_i) is actually just P(H | C_i) — F_n has no real role to play here. This is exactly our definition from before: we said E_1 and E_2 are conditionally independent given that F has occurred if P(E_1 | E_2 ∩ F) = P(E_1 | F), so E_2 has no bearing on the conditional probability of E_1 given F. The same thing here: conditional on C_i, the event F_n has no role to play, and P(H | F_n ∩ C_i) = P(H | C_i) = i/k. So now I can apply this in the formula; what remains to compute is the other factor, P(C_i | F_n), the conditional probability of having picked the i-th coin given that n heads have shown up.
Since a coin is randomly selected, any of the coins in the box is equally likely, so the probability of picking up the i-th coin is P(C_i) = 1/(k + 1). For P(C_i | F_n) — this is the part I had missed writing out — I rewrite it in decomposed (Bayes) form: P(C_i | F_n) = P(F_n | C_i) P(C_i) / Σ_{j=0}^{k} P(F_n | C_j) P(C_j). Now, by conditional independence the probability of a head stays the same on every toss of the i-th coin, so the probability of n heads is P(F_n | C_i) = (i/k)^n, with P(C_i) = 1/(k + 1); similarly the denominator is Σ_{j=0}^{k} (j/k)^n · 1/(k + 1), where j runs over all the coins (this corresponds to whichever coin was chosen). The factors 1/(k + 1) cancel, and putting everything together, P(H | F_n) = Σ_{i=0}^{k} (i/k)^{n+1} / Σ_{j=0}^{k} (j/k)^n. So it is just a computation — maybe you can say it is an engineered problem, or whatever — but we could make use of the concept of conditional independence and arrive at this result. And through methods of calculus you can actually show that if k is large, this probability is approximately (n + 1)/(n + 2); things get simplified when you have a large number of coins in the box.
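The final formula is easy to check numerically. This sketch evaluates the two sums directly and compares against the large-k approximation (n + 1)/(n + 2):

```python
def p_next_head(k, n):
    """P(toss n+1 is a head | first n tosses were heads), with
    k+1 coins where coin i shows heads with probability i/k."""
    num = sum((i / k) ** (n + 1) for i in range(k + 1))
    den = sum((j / k) ** n for j in range(k + 1))
    return num / den

# With n = 3 heads observed, (n+1)/(n+2) = 4/5 = 0.8.
for k in (10, 100, 10_000):
    print(k, p_next_head(k, n=3))
```

As k grows the sums approximate the integrals of x^{n+1} and x^n over [0, 1], whose ratio is exactly (n + 1)/(n + 2) — that is the calculus argument mentioned above.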
So this is one example, and you will often come across situations involving the concept of conditional independence. Let me now discuss Exercise 2 with you; I will just give brief hints. Question 1 says you have to show that the probability that exactly one of the events E and F occurs equals P(E) + P(F) − 2P(EF); when we write P(EF) we mean P(E ∩ F) — that notation is also acceptable, you simply write EF without the intersection sign. Now, if E, F and G are 3 events, you have to find expressions for certain events; again I just want you to be familiar with expressing events in terms of complements, unions and intersections. In part 1: of the 3 events E, F and G, only E occurs. In part 2: exactly 2 of them occur, so the third one should not occur — you can imagine that you will have to use unions, intersections and complements. Question 3 (the conditioning sign on the slide is a bit dim, but anyway) says: if P(A) > 0, show that P(A ∩ B | A) ≥ P(A ∩ B | A ∪ B). This is actually very straightforward, because A is a subset of A ∪ B, and as we have already discussed, P(A ∪ B) ≥ P(A).
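Question 1's identity can be sanity-checked by enumeration on a small equally likely sample space (the particular sets below are my own choice, not from the exercise):

```python
from fractions import Fraction

omega = set(range(1, 9))          # 8 equally likely outcomes
E, F = {1, 2, 3, 4}, {3, 4, 5}

def prob(a):
    return Fraction(len(a), len(omega))

# "Exactly one of E and F occurs" is the symmetric difference
# (E \ F) U (F \ E).
exactly_one = (E - F) | (F - E)

lhs = prob(exactly_one)
rhs = prob(E) + prob(F) - 2 * prob(E & F)
assert lhs == rhs
print(lhs)  # 3/8
```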
On the left-hand side, when you compute the conditional probability, the denominator is P(A), and the numerator is P(A ∩ B), because (A ∩ B) ∩ A is again A ∩ B. On the right-hand side the numerator is the same, but the denominator is P(A ∪ B), and since P(A ∪ B) ≥ P(A), you have the required inequality. I gave this to you so that you learn to figure out such things; once you write out the expressions, you can immediately answer the question. Problem 4 is from Sheldon Ross. In answering a question on a multiple-choice test — where you have several choices and must pick the right one — a student either knows the answer or she guesses. Let p be the probability that she guesses, and assume that a student who guesses at the answer will be correct with probability 1/m, where m is the number of choices — if she is guessing, she does not know, so each of the m choices is equally likely. What is the conditional probability that the student knew the answer to the question, given that she answered it correctly? I have given the answer on the slide; use the concept of conditional probability and you should be able to do it. The next problem is also from Sheldon Ross. At a certain stage of a criminal investigation, the inspector in charge is 60 percent convinced of the guilt of a certain suspect.
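For Problem 4, a hedged sketch of the Bayes computation under this transcript's convention that p is the probability of guessing (so 1 − p is the probability she knows the answer; the function name and sample numbers are mine):

```python
def p_knew_given_correct(p_guess, m):
    """P(knew | correct): she knows with probability 1 - p_guess
    and is then surely correct; she guesses with probability
    p_guess and is then correct with probability 1/m."""
    p_correct = (1 - p_guess) * 1.0 + p_guess * (1.0 / m)
    return (1 - p_guess) / p_correct

# e.g. a 1/2 chance of guessing and m = 4 choices:
print(p_knew_given_correct(0.5, 4))  # 0.8
```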
So, his conviction is 60 percent; that means 0.6 is the probability of the person being guilty. Now, suppose that a new piece of evidence shows that the criminal has a certain characteristic, such as left handedness, baldness, brown hair, etcetera. So, it turns out, say through some eyewitness who may have seen the criminal doing the act, that whoever committed the crime has this characteristic. If 20 percent of the population possesses this characteristic, how certain of the guilt of the suspect should the inspector now be, if it turns out that the suspect is among this group? So, now this is a situation for computing the Bayes probability, because you see, initially the inspector is 60 percent convinced of the guilt of a certain person; now it is known that the criminal possesses some characteristic, and it turns out that this suspect has that characteristic. Therefore, the probability of the suspect being the criminal would go up, and so the posterior probability, after knowing that the suspect possesses the characteristic, will go up. So, I want you to compute the posterior probability here. Problem 6 says a parallel system functions whenever at least one of its components works. Consider a parallel system of n components and suppose that each component independently works with probability 1 by 2. Then find the conditional probability that component 1 works, given that the system is functioning. So, here you have to use the concepts of independence and conditional probability.
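Both of these can be checked with a few lines of exact arithmetic. The sketch below works through the Bayes update of problem 5 and the parallel-system conditional of problem 6; the function name and the choice n = 3 are illustrative assumptions.

```python
from fractions import Fraction

# Problem 5: prior guilt 0.6; the criminal has a characteristic that
# 20% of the population possesses, and the suspect turns out to have it.
prior = Fraction(6, 10)
p_char_given_guilty = 1               # the criminal certainly has it
p_char_given_innocent = Fraction(2, 10)

posterior = (p_char_given_guilty * prior) / (
    p_char_given_guilty * prior + p_char_given_innocent * (1 - prior)
)
print(posterior)                      # 15/17, about 0.882: the belief goes up

# Problem 6: n independent components, each works with probability 1/2.
# The system fails only if all fail, so P(system works) = 1 - (1/2)**n, and
# P(component 1 works | system works) = P(component 1 works) / P(system works),
# since "component 1 works" already implies the system works.
def p_comp1_given_working(n):
    return Fraction(1, 2) / (1 - Fraction(1, 2) ** n)

print(p_comp1_given_working(3))       # 4/7 for n = 3
```

Note the shortcut in problem 6: because component 1 working is a subset of the system functioning, the numerator of the conditional probability is just 1/2.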
So, problem 7 says that you have to either prove or give counter examples to the following statements, which are self explanatory; you should be able to either show that a statement is valid, or otherwise construct an example to show that it is not. Now, in problem 8 I am asking you to show that if E, F and G are independent, then E is independent of F union G. See, remember, here I am using the definition of 3 events being independent, so you have those 4 conditions that will be satisfied, and then you can easily show that E is independent of F union G. In fact, you can show that E will be independent of any event obtained from F and G by the operations of union, intersection and complement. So, this is what we are saying here. Now, problem 9: stores A, B and C have 50, 75 and 100 employees respectively, and 50, 60 and 70 percent of these are women. So, that means store A has 50 employees, of which 50 percent are women, so you can immediately say that 25 are women; similarly, 60 percent of the 75 employees in store B are women, and 70 percent of the 100 employees in store C are women. Resignations are equally likely among all employees, regardless of sex. One employee resigns, and this is a woman; what is the probability that she works in store C? So, here again I am asking you to use the Bayes formula to compute the probability: a woman employee resigns, that is given, and you have to find the probability that she works in store C. In problem 10, the probability that a new car battery functions for over 10,000 kilometers is 0.8, the probability that it functions for over 20,000 kilometers is 0.4, and the probability that it functions for over 30,000 kilometers is 0.1.
So, here you have to compute conditional probabilities. If a new car battery is still working after 10,000 kilometers, what is the probability that its total life will exceed 20,000 kilometers, and then, what is the probability that its additional life will exceed 20,000 kilometers? Maybe we will leave out problem 10 from here and revisit it later on, but problem 11 you can answer easily. Suppose that a person chooses a letter at random from the word RESERVE, that means it can be any of the letters R, E, S, E, R, V, E, and then chooses one at random from the word VERTICAL. What is the probability that the same letter is chosen? This, of course, goes back to our earlier counting of favorable outcomes; the letters that are common between these two words are R, E and V. So, you have to find the probability that R gets selected from both words, or E gets selected from both, or V gets selected from both. You can see, for example, that in the word RESERVE the letter R appears twice out of the 7 letters, while in VERTICAL it appears only once out of 8 letters. So, you can accordingly find out the probabilities, and since the choices of a letter from RESERVE and from VERTICAL are independent events, the probability for each common letter is the product of the two probabilities, and the required probability is the sum over the common letters.
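Problems 9, 10 and 11 above all reduce to short exact computations. Here is a minimal sketch in Python fractions; the variable and function names are my own.

```python
from fractions import Fraction
from collections import Counter

# Problem 9: stores A, B, C have 50, 75, 100 employees; 50%, 60%, 70% are women.
women = {"A": 50 * Fraction(1, 2),    # 25 women
         "B": 75 * Fraction(3, 5),    # 45 women
         "C": 100 * Fraction(7, 10)}  # 70 women
# Resignations equally likely among all employees, so given that the resigning
# employee is a woman, P(store C) = (women in C) / (total women).
p_store_C = women["C"] / sum(women.values())
print(p_store_C)                      # 70/140 = 1/2

# Problem 10: P(>10k) = 0.8, P(>20k) = 0.4, P(>30k) = 0.1.
# The events are nested, so each conditional is a simple ratio.
p10, p20, p30 = Fraction(8, 10), Fraction(4, 10), Fraction(1, 10)
print(p20 / p10)                      # total life > 20k, given > 10k: 1/2
print(p30 / p10)                      # additional life > 20k, given > 10k: 1/8

# Problem 11: one letter from each word, chosen independently;
# P(same letter) = sum over common letters of (count1/len1) * (count2/len2).
def p_same_letter(w1, w2):
    c1, c2 = Counter(w1), Counter(w2)
    return sum(Fraction(c1[ch], len(w1)) * Fraction(c2[ch], len(w2))
               for ch in c1.keys() & c2.keys())

print(p_same_letter("RESERVE", "VERTICAL"))   # (2 + 3 + 1)/56 = 3/28
```

In problem 11 the three common letters R, E and V contribute 2/56, 3/56 and 1/56 respectively, giving 6/56 = 3/28.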