So, having defined the axioms of probability and shown how a probability function can be obtained, let me now derive a few simple propositions using these axioms. The first proposition is that P(∅) = 0. To prove it, in axiom 3 we take E_1 = Ω and all the remaining E_i to be the empty set, and here it is convenient to take the index set I to be finite. The condition for axiom 3 is then satisfied: Ω is certainly disjoint from the empty set, and the empty sets are disjoint from each other. Applying axiom 3, P(Ω) + (|I| − 1) P(∅) = P(Ω), since ∅ gets added |I| − 1 times (one index has gone to Ω) and the union of all the E_i is just Ω. By axiom 2, P(Ω) = 1, so the right-hand side is 1. Now, |I| must be 2 or more, because there is at least one set equal to Ω and at least one set equal to ∅, so |I| − 1 is not 0. Cancelling P(Ω) from both sides, we are left with (|I| − 1) P(∅) = 0, and since |I| − 1 ≠ 0, P(∅) must be 0.
So, this is how we obtain the proposition that P(∅) is always 0. Proposition 2 says that the probability of the complement of an event E is 1 minus the probability of E, and this can be derived very simply. The attempt here is to show you that once you define the axioms, you can use them to logically derive many results and build up a good structure, so that you can then compute probabilities of more complex events. For any event E, E ∪ E^c = Ω, because every element of Ω is either in E or in E^c. Since E and E^c are disjoint, that is, mutually exclusive, it follows from axioms 2 and 3 that P(E) + P(E^c) = P(E ∪ E^c) = P(Ω) = 1. Therefore P(E^c) = 1 − P(E), which is your second proposition. So, using the three basic axioms, I have arrived at the result that P(∅) = 0, and now I have shown you that the probability of the complement of an event is 1 minus the probability of that event. Proposition 2.3, and you can see that the results are getting more and more complex, says that if E and F are two events, subsets of Ω, then P(E ∪ F) = P(E) + P(F) − P(E ∩ F). Notice that if E ∩ F is empty, the last term is 0, because we have just derived that P(∅) = 0; in that case E and F are disjoint, P(E ∪ F) = P(E) + P(F), and the formula is in compliance with axiom 3.
Now, let us prove this result. I first claim that E ∪ F = E ∪ (E^c ∩ F): whatever elements of F are in E are already covered by E, and the remaining elements of F are not in E, so they are in E^c. By doing this I have broken the union into a union of two disjoint sets, because E ∩ (E^c ∩ F) = ∅; there cannot be any element which is in both E and E^c. So axiom 3 applies, and P(E ∪ F) = P(E) + P(E^c ∩ F); call this equation (1). Next we decompose F in a way that again lets us use axiom 3. We have F = F ∩ Ω, and Ω = E ∪ E^c as I just used, so F = F ∩ (E ∪ E^c) = (F ∩ E) ∪ (F ∩ E^c), by the distributive law which I defined for you in an earlier lecture. Here you have to be careful to write a union and not a plus sign; these are subsets, and there is no meaning in putting a plus sign between sets. Again both sets in this union are disjoint, so by axiom 3, P(F) = P(E ∩ F) + P(E^c ∩ F).
From this relationship I can now compute P(E^c ∩ F) = P(F) − P(E ∩ F), and substituting this in (1) we obtain the desired result, P(E ∪ F) = P(E) + P(F) − P(E ∩ F): for any two events, the probability of the union is the sum of the probabilities of the two sets minus the probability of their intersection. There is a very intuitive argument for the validity of this result. E ∩ F consists of the common elements, so it is a subset of both E and F; when you add up P(E) + P(F), the probability of E ∩ F gets added twice, and I have to subtract it once to make the equation balanced. You can also explain this through a Venn diagram: if this is A and this is B, then P(A) is this area and P(B) is this whole area, and when you put them together you have added the area of A ∩ B twice, so you need to subtract it once for the equation to balance. An immediate corollary of the result P(E ∪ F) = P(E) + P(F) − P(E ∩ F) is that if A is a subset of E, then P(A) ≤ P(E). Because A ⊆ E, I can write the set E as A ∪ (A^c ∩ E): if this is E and this is the subset A, then this portion is A and the remaining portion is A^c ∩ E. These two sets are disjoint, so applying axiom 3 again, P(E) = P(A) + P(A^c ∩ E).
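The four results so far can be checked numerically on any finite sample space with equally likely outcomes. A minimal sketch; the particular sample space and events here are my own illustration, not from the lecture:

```python
from fractions import Fraction

omega = set(range(1, 7))           # sample space: one roll of a die

def P(event):
    """Equally likely outcomes: P(E) = |E| / |Omega|."""
    return Fraction(len(event & omega), len(omega))

E = {1, 2, 3}
F = {3, 4}
A = {1, 2}                          # A is a subset of E

assert P(set()) == 0                                  # Proposition 1: P(empty) = 0
assert P(omega - E) == 1 - P(E)                       # Proposition 2: P(E^c) = 1 - P(E)
assert P(E | F) == P(E) + P(F) - P(E & F)             # Proposition 2.3: inclusion-exclusion
assert P(A) <= P(E)                                   # Corollary: A subset of E implies P(A) <= P(E)
print("all four propositions verified")
```

Using `Fraction` keeps the probabilities exact, so the equalities can be asserted without floating-point tolerance.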
Because I have decomposed the event E into the union of two disjoint events, P(E) = P(A) + P(A^c ∩ E), and since P(A^c ∩ E) ≥ 0, it follows that P(E) ≥ P(A). This is an immediate consequence, and I have put it down only for completeness' sake; once you have proved the previous result you can conclude it immediately. Once you have these three propositions you can get more results and compute probabilities of more interesting and complex events. Consider this example: 20 percent of Indians smoke cigarettes, 40 percent smoke bidis, and 7 percent smoke both cigarettes and bidis. What percentage of people smoke neither? That is, you are asking what percentage of people smoke neither cigarettes nor bidis. If I write E for the event that a person smokes cigarettes and F for the event that a person smokes bidis, then I will first compute P(E ∪ F). By the proposition I have just proved, P(E ∪ F) = P(E) + P(F) − P(E ∩ F) = 0.2 + 0.4 − 0.07 = 0.53, that is, 20 percent plus 40 percent minus 7 percent. Now, I am looking for the percentage of people who smoke neither. Remember when I was talking of De Morgan's laws: (E ∪ F)^c = E^c ∩ F^c, and E^c is the set of people who do not smoke cigarettes, F^c the set of people who do not smoke bidis, so the intersection gives me the set of people who smoke neither cigarettes nor bidis. So I am computing P((E ∪ F)^c), using proposition 2.2 together with the value of P(E ∪ F) that I have just computed.
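The arithmetic of the smoking example can be sketched as follows, as a check on the inclusion-exclusion and complement steps, assuming the percentages given in the lecture:

```python
from fractions import Fraction

p_cig  = Fraction(20, 100)   # 20 percent smoke cigarettes: P(E)
p_bidi = Fraction(40, 100)   # 40 percent smoke bidis: P(F)
p_both = Fraction(7, 100)    # 7 percent smoke both: P(E n F)

# Proposition 2.3: P(E u F) = P(E) + P(F) - P(E n F)
p_union = p_cig + p_bidi - p_both

# De Morgan + Proposition 2: P(smoke neither) = P((E u F)^c) = 1 - P(E u F)
p_neither = 1 - p_union

print(p_union, p_neither)  # 53/100 47/100
```

Exact fractions avoid the rounding noise that 0.2 + 0.4 − 0.07 would produce in floating point.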
So, now I want to compute P((E ∪ F)^c), which is 1 − P(E ∪ F) = 1 − 0.53 = 0.47. Now, you have already come across the notion of continuity of a function on the real line. We want to introduce the concept of the probability function as a continuous set function. This is very useful because we often have to take limits of sequences of events, and then we need this notion of probability as a continuous function on sets, so that I can interchange the process of taking the limit and taking the probability. Let us formalize this. A sequence of events E_n, n ≥ 1, is said to be an increasing sequence if E_1 ⊆ E_2 ⊆ … ⊆ E_n ⊆ …, and similarly a decreasing sequence is one with E_1 ⊇ E_2 ⊇ … ⊇ E_n ⊇ …. For an increasing sequence E_n, we define the limit of E_n as n goes to infinity to be ∪_{i=1}^{∞} E_i. Since the sequence is increasing, there is nothing deep in this definition; we set it up this way because we want to make precise the idea of interchanging the probability and the limit. For a decreasing sequence E_n, we define the limit of E_n as ∩_{i=1}^{∞} E_i. Now the proposition, and this is the main result, is that if E_n is either an increasing or a decreasing sequence of events, then the limit of P(E_n) as n goes to infinity equals P of the limit of E_n as n goes to infinity.
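The proposition can be illustrated on a finite model where the limit is actually attained; a sketch with an increasing sequence of events under equally likely outcomes (the particular sets are my own choice for illustration):

```python
from fractions import Fraction

omega = set(range(1, 101))                 # 100 equally likely outcomes

def P(event):
    return Fraction(len(event), len(omega))

# Increasing sequence E_n = {1, ..., 10n}, so E_1 c E_2 c ... c E_10 = omega
E = [set(range(1, 10 * n + 1)) for n in range(1, 11)]

limit_event = set().union(*E)              # definition: lim E_n = union of the E_i
assert limit_event == omega

probs = [P(En) for En in E]
assert probs == sorted(probs)              # P(E_n) is monotone increasing
assert probs[-1] == P(limit_event) == 1    # lim P(E_n) equals P(lim E_n)
print("continuity holds on this model")
```

In a genuinely infinite model the assertion about the last term becomes a statement about the limit, which is exactly what the proposition guarantees.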
So, what we are saying is that you can interchange the operations of taking the limit and taking the probability, because of the continuity of P on sets. Let us quickly look at the proof. Consider the case when E_n is an increasing sequence; we will then go over the case when E_n is a decreasing sequence. Define a new sequence of events F_i as follows: F_1 = E_1 and F_2 = E_2 ∩ E_1^c. So whatever common elements of E_1 were in E_2 are not there anymore in F_2, and you can immediately see that F_1 and F_2 are disjoint. Continuing this way, define F_n = E_n ∩ (∪_{i=1}^{n−1} E_i)^c: you take the union of all the previous sets E_i, take its complement, and intersect with E_n. So you can see that, the way we systematically defined F_1, F_2, …, F_n, these are all disjoint events; that is what we mean by saying the F_i, i ≥ 1, are disjoint. Now, ∪_{i=1}^{n} F_i = ∪_{i=1}^{n} E_i, because I am not taking anything away; I am only removing the common parts to make the sets disjoint, but otherwise all the elements of E_1, E_2, …, E_n are still there. In the limiting case this is also true. Therefore P(∪_{i=1}^{∞} E_i) = P(∪_{i=1}^{∞} F_i) = Σ_{i=1}^{∞} P(F_i), where the last step uses axiom 3, since the F_i are disjoint. And on the left-hand side, ∪_{i=1}^{∞} E_i is nothing but the limit of E_n, by our definition.
We defined ∪_{i=1}^{∞} E_i to be the limit of E_n as n goes to infinity, so that side is settled. On the other side, Σ_{i=1}^{∞} P(F_i) can be written as the limit as n goes to infinity of Σ_{i=1}^{n} P(F_i), which by axiom 3 equals the limit of P(∪_{i=1}^{n} F_i). But ∪_{i=1}^{n} F_i = ∪_{i=1}^{n} E_i = E_n, which you can see immediately because E_n is an increasing sequence. So this is what we wanted to show: P(lim E_n) = lim P(E_n) as n goes to infinity, for an increasing sequence. Now, when the sequence is decreasing, the complements E_n^c are increasing, because the E_n are decreasing. So apply what we have just proved to the sequence of events E_n^c: P(∪_{i=1}^{∞} E_i^c) = lim P(E_n^c) as n goes to infinity. But this union is nothing but (∩_{i=1}^{∞} E_i)^c, by the De Morgan's laws that we did in the previous lecture. So P((∩_{i=1}^{∞} E_i)^c) = lim P(E_n^c), and writing each side in terms of complements, the left side is 1 − P(∩_{i=1}^{∞} E_i), while the right side, since P(E_n^c) = 1 − P(E_n), is lim (1 − P(E_n)).
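The disjointification step in the proof, F_1 = E_1 and F_n = E_n minus everything that came before, can be sketched concretely; the particular increasing sequence here is illustrative:

```python
# An increasing sequence of events, represented as sets of outcomes
E = [{1}, {1, 2}, {1, 2, 3}, {1, 2, 3, 4}]

# F_1 = E_1, and F_n = E_n n (E_1 u ... u E_{n-1})^c, i.e. E_n minus the earlier union
F, seen = [], set()
for En in E:
    F.append(En - seen)
    seen |= En

# The F_i are pairwise disjoint ...
for i in range(len(F)):
    for j in range(i + 1, len(F)):
        assert F[i] & F[j] == set()

# ... and their running unions agree with those of the E_i
union_F, union_E = set(), set()
for Fi, Ei in zip(F, E):
    union_F |= Fi
    union_E |= Ei
    assert union_F == union_E

print(F)  # [{1}, {2}, {3}, {4}]
```

This is exactly why axiom 3 becomes applicable in the proof: nothing is removed from the union, only the overlaps are stripped away.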
Cancelling the 1 from both sides, P(∩_{i=1}^{∞} E_i) = lim P(E_n), and since the sequence is decreasing, ∩_{i=1}^{∞} E_i is by definition the limit of E_n as n goes to infinity. So you have proved the result for both cases. There will be many occasions where I will be referring to the fact that P is a continuous set function, and I will very often be exchanging the process of taking the limit and taking the probability. Now I will define another new concept, conditional probability, and you will see again how, starting from the three axioms, I am able to develop more and more theory about probability. For conditional probability I will take an example first. You have two fair dice; fair and unbiased are synonymous words. In the experiment we considered of rolling two dice, let A be the event that the first die shows two, and B the event that the numbers on the two faces add up to six. The first die is already showing a two; knowing that, I want the conditional probability that the sum of the two numbers is six, that is, the conditional probability of B given A. So what is B ∩ A? The first die shows two and you want the sum of the two numbers to be six, so the other die must show four. This gives you A ∩ B = {(2, 4)}. Given that the first die shows two, the second die can show any of the six numbers, and the probability we want is P(B | A) = P(B ∩ A) / P(A).
So, P(A) is the probability that the first die shows two and the second die shows any of the six numbers. The pair (2, 4) is just one of the 36 equally likely outcomes, because we are assuming both dice are fair, so its probability is 1/36; each of the six outcomes making up A is again equally likely, each with probability 1/36, so P(A) = 6/36 = 1/6. Therefore the outcome is P(B | A) = (1/36)/(1/6) = 1/6. This is how we compute the conditional probability, and we say that P(B | A) is the conditional probability of B given that A has occurred: once you know that the event A has taken place, you want to find the probability that B will occur. The formal definition is: if P(A) > 0, that is, A is an event which actually can occur, then P(B | A) = P(A ∩ B) / P(A). This definition is a valid one because we have assumed P(A) is positive. You can immediately write this another way: P(A ∩ B) = P(A) P(B | A). This is a very important formula which we keep using very often, so let me illustrate the concept through an example. In the game of bridge, 52 cards are dealt equally to 4 players, so each player gets 13 cards; the 4 players are A, B, C and D. A and C have been dealt 7 clubs between them. What is the probability that B has exactly 4 clubs?
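The dice computation of P(B | A) can be checked by enumerating all 36 equally likely outcomes; a sketch:

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))     # 36 equally likely pairs

def P(event):
    return Fraction(len(event), len(omega))

A = {(d1, d2) for d1, d2 in omega if d1 == 2}        # first die shows two
B = {(d1, d2) for d1, d2 in omega if d1 + d2 == 6}   # the two faces sum to six

# Definition: P(B | A) = P(A n B) / P(A)
assert A & B == {(2, 4)}
assert P(A) == Fraction(6, 36)
print(P(A & B) / P(A))  # 1/6
```

Enumerating the sample space like this is a useful sanity check for any equally-likely-outcomes argument.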
This is the question. The event that has already occurred is that A and C together have 7 clubs between them, and we want the probability that B gets exactly 4 clubs. Here again I am using multinomial coefficients. Since 7 of the 13 clubs have been dealt to A and C, you are left with 6 more clubs, and B should get 4 of these remaining 6. Since B gets 13 cards in all, the other 9 cards must come from the remaining 20 non-clubs. The total number of ways in which B can be dealt 13 cards out of the remaining 26 (26 cards have already been given to A and C) is 26 choose 13. So I have computed this probability using the coefficients: there are two sets of cards, the choice is 4 out of 6 clubs and 9 out of 20 non-clubs, and the conditional probability of B getting 4 clubs, when A and C have already been given 7 clubs between them, is (6 choose 4)(20 choose 9) / (26 choose 13). Now you have to explain this in terms of the definition: I would like you to figure out how this computation fits into the definition of conditional probability, and I will come back and discuss it with you again once you have figured it out. Now, let me give you another extension of this concept of conditional probability, and I just want to point out here that Bayes is a name.
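The bridge probability can be evaluated directly with binomial coefficients; a sketch using Python's `math.comb`:

```python
from math import comb

# A and C hold 26 cards including 7 clubs, so the remaining 26 cards
# contain 6 clubs and 20 non-clubs; B is dealt 13 of those 26.
clubs_left, others_left = 6, 20

# P(B has exactly 4 clubs) = C(6,4) * C(20,9) / C(26,13)
p = comb(clubs_left, 4) * comb(others_left, 9) / comb(clubs_left + others_left, 13)
print(round(p, 4))  # about 0.242
```

This is a hypergeometric probability: 13 cards drawn without replacement from a population of 6 clubs and 20 non-clubs.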
So the apostrophe is put after the s: Bayes' formula, Bayes being the name of the statistician. As we go on we will show that it is a very useful way of computing certain probabilities, conditional probabilities of course, so let me tell you what the formula is and then take some examples to illustrate it. Suppose E and F are two events. As already shown to you, I can write E = (E ∩ F) ∪ (E ∩ F^c); since F and F^c are disjoint, E ∩ F and E ∩ F^c are also disjoint, and therefore, as we concluded earlier, P(E) = P(E ∩ F) + P(E ∩ F^c). Now we can generalize this formula. If I have n mutually exclusive events F_1, …, F_n whose union is the whole sample space Ω, then I can decompose P(E) as the sum of the P(E ∩ F_i), and using the conditional probability formula for each E ∩ F_i, P(E) = Σ_i P(E | F_i) P(F_i). This is a way of computing P(E) by conditioning E on the occurrence of the F_i. In other words, you can also look at it this way: since the F_i are disjoint and their union is Ω, Σ_i P(F_i) = P(∪_i F_i) = P(Ω) = 1. So I can treat the P(F_i) as weights, and the formula is saying that P(E) is the weighted average of the conditional probabilities P(E | F_i) with weights P(F_i): whatever conditioning event you take, you multiply its conditional probability by the corresponding weight and sum.
So, there are many ways of looking at this. Why am I discussing it? Suppose now that E has occurred, and you want to find the probability of occurrence of F_i. See, I have changed the roles of the two events: earlier I was computing the probability of E conditional on the occurrence of the events F_i; now I am saying that if I already know that E has occurred, what is the probability of the occurrence of a particular F_i? The point is that F_i ∩ F_j = ∅ for i ≠ j and ∪_i F_i = Ω, so exactly one of these events occurs; because they are mutually exclusive, two of them cannot occur at the same time. So when E has occurred, I want to compute P(F_i | E), the probability of F_i given that E has occurred. Using the formula I just obtained, P(F_i | E) = P(F_i ∩ E) / P(E) = P(E | F_i) P(F_i) / Σ_j P(E | F_j) P(F_j), where the denominator is the expression for P(E) I have just computed for you, and the numerator P(F_i ∩ E) = P(E | F_i) P(F_i) comes from my definition of conditional probability. P(F_i | E) is called the posterior probability of F_i given that E has occurred, and I will now take up an example to show you what we mean by this, an example of computing the Bayes probability. In this and some other previous examples in the other lectures I have mentioned S R, which means that I am taking these examples from the books by Sheldon Ross; there are two books by Sheldon Ross, both of which I have referred to while preparing this lecture, and I will give you the proper references at the end of this section. Now, let us understand this problem very well.
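The total-probability decomposition and the posterior formula can be written as a small helper; a sketch, where the particular priors and likelihoods are hypothetical numbers chosen only for illustration:

```python
def total_probability(priors, likelihoods):
    """P(E) = sum_i P(E | F_i) P(F_i), for mutually exclusive F_i covering Omega."""
    assert abs(sum(priors) - 1.0) < 1e-9       # the weights P(F_i) must sum to 1
    return sum(l * p for l, p in zip(likelihoods, priors))

def posterior(i, priors, likelihoods):
    """Bayes: P(F_i | E) = P(E | F_i) P(F_i) / sum_j P(E | F_j) P(F_j)."""
    return likelihoods[i] * priors[i] / total_probability(priors, likelihoods)

# Hypothetical example: three classes with priors P(F_i) and likelihoods P(E | F_i)
priors = [0.5, 0.3, 0.2]
likelihoods = [0.1, 0.4, 0.8]

p_E = total_probability(priors, likelihoods)
posts = [posterior(i, priors, likelihoods) for i in range(3)]
assert abs(sum(posts) - 1.0) < 1e-9            # the posteriors again sum to 1
print(round(p_E, 2))  # 0.33
```

Note that P(E) is indeed a weighted average of the likelihoods, and the posteriors form a new probability distribution over the F_i.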
There is an insurance company which divides its clients into two classes. Somehow there is information that some people are more accident prone than others; there are ways of estimating this, maybe from past history, because insurance companies depend on this kind of data about who has had a lot of accidents and who has not. So let us say that class one is the set of accident-prone people and class two is the set of non-accident-prone people, that is, those who have had very few accidents in the past. Everything we are looking at is within a fixed period of one year. The probability that a class-one, accident-prone person has an accident in that fixed period is 0.4, and the probability that a non-accident-prone person has an accident is 0.2. It is also known that 30 percent of the population is accident prone, that is, the probability of a person being in class one is 0.3. Now, the event E we are looking at is that a new policy holder will have an accident within a year of purchase of the policy: a person has just bought an insurance policy, and you want to compute the probability that this person has an accident within the year. Using the decomposition principle, I can write P(E) = P(E | 1) P(1) + P(E | 2) P(2), where 1 denotes the event that the person is accident prone and 2 the event that the person is non accident prone.
When I write P(E | 2) I mean the probability that a non-accident-prone person has an accident, and P(E | 1) is the probability that an accident-prone person has an accident, and I have both of these probabilities. So this we can easily compute: the conditional probability that a policy holder has an accident given that he is accident prone is 0.4, and the probability that he is accident prone is 0.3; the conditional probability that he has an accident given that he is non accident prone is 0.2, and the probability that he is non accident prone is 0.7. So P(E) = 0.4 × 0.3 + 0.2 × 0.7 = 0.26: the probability that a new policy holder has an accident within a year of purchase of the policy is 0.26. Now I want to compute the conditional probability P(1 | E), which by our definition is P(1 ∩ E) / P(E). If you read this intersection of the two events carefully: E says that a new policy holder, a person who has just purchased a policy, will have an accident within a year, and 1 says that the person is accident prone; so the intersection says that an accident-prone policy holder will have an accident within a year of purchase of the policy. That is the numerator, and it is divided by P(E).
Now, I do not have P(1 ∩ E) directly, but remember the alternate way of writing the probability of an intersection: P(1 ∩ E) = P(E | 1) P(1), the probability that an accident-prone person has an accident within a year times the probability that the person is accident prone. So P(1 | E) = P(E | 1) P(1) / P(E) = (0.4 × 0.3) / 0.26, where P(E) = 0.26 I have already computed and 0.4 × 0.3 I already know from the first part. This comes out to about 0.4615. Earlier, the prior probability that a policy holder is accident prone was 0.3; but now the posterior probability, that is, after knowing that the policy holder has had an accident, remember I am computing the probability that he is accident prone given that he has had an accident, has gone up to 0.4615. So the posterior probability of being accident prone has gone up from 0.3 to 0.4615. This is actually a slightly subtle concept, and I will keep coming back to it through examples to make sure that you get a better feeling for the computation of Bayes probabilities. Now, the moment you define conditional probability, you come to the concept of independent events. Let me motivate this definition. What is being said is that if the conditional probability P(e | f) of e given that f has occurred is equal to P(e), then the occurrence of f has no effect on the occurrence of e, because this probability has remained unchanged even though I know that the event f has occurred.
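The insurance computation fits this total-probability-plus-Bayes pattern; a sketch with the lecture's numbers:

```python
p_prone = 0.3                      # P(1): person is accident prone
p_not_prone = 0.7                  # P(2): person is non accident prone
p_acc_given_prone = 0.4            # P(E | 1)
p_acc_given_not = 0.2              # P(E | 2)

# Total probability: P(E) = P(E|1) P(1) + P(E|2) P(2)
p_acc = p_acc_given_prone * p_prone + p_acc_given_not * p_not_prone
print(round(p_acc, 2))             # 0.26

# Bayes: posterior P(1 | E) = P(E|1) P(1) / P(E)
posterior_prone = p_acc_given_prone * p_prone / p_acc
print(round(posterior_prone, 4))   # 0.4615
```

Observing the accident moves the probability of the accident-prone class from the prior 0.3 up to roughly 0.46.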
These are the kinds of things you want to know: which events can have an effect on the occurrence of other events, and which cannot. This is also a very important concept, and you need to know how to work with it. So, if P(e | f) = P(e), then by the definition of conditional probability, P(e ∩ f) / P(f) = P(e). I am trying to formalize the idea that if the occurrence of an event does not depend, in this sense, on the occurrence of another event, then the two events are independent. Since P(e ∩ f) / P(f) = P(e), it follows that P(e ∩ f) = P(e) P(f), and this is our concept of two events being independent. So let me write down definition 2.9: two events e and f are said to be independent if P(e ∩ f) = P(e) P(f). Let us look at this example: a card is selected from an ordinary pack of 52 playing cards. Let e be the event that the selected card is a king, and f the event that the card is a club. Then you can immediately show that e and f are independent, which you can also argue intuitively: knowing that the card is a club tells you nothing about whether it is a king. Let us now show that this gets validated by the definition. To compute the probability of e, since there are four kings in the pack, P(e) = 4/52; since 13 cards belong to clubs, P(f) = 13/52; and e ∩ f is the event that the card is the king of clubs.
So, that is only one card and therefore its probability is 1 by 52. Now, P e into P f is 4 by 52 into 13 by 52, that is, 1 by 13 into 1 by 4, which is also 1 by 52. So, the two probabilities are equal and therefore e and f are independent. Now, similarly, look at this other example: two fair dice are rolled, e 1 is the event that the sum of the two numbers is 5, and f is the event that the first die shows 3. Then e 1 intersection f will simply be the event 3 comma 2: since the first number is already 3, the second number must be 2 in order for the sum to be 5. So, this probability is 1 by 36. But what is P e 1? The event e 1 is that the sum of the two numbers is 5; the favourable pairs are 2 3, 3 2, 1 4 and 4 1, four pairs in all, each equally likely, so this probability is 4 by 36. And the probability of the first die showing 3 is 1 by 6, because each number on a die is equally likely. So, P of e 1 intersection f, which is 1 by 36, is not equal to P e 1 into P f, which is 4 by 36 into 1 by 6, and we say that e 1 and f are not independent events. Now, consider the event e that the sum of the two faces is 7. If I change the event from e 1 to e, which requires that the sum of the numbers on the two faces is 7, then there are 6 valid pairs, so P e will be 6 upon 36, which is 1 by 6, and P f, which says that the first die must show the number 3, is again 1 by 6. Now look at the event e intersection f: since f says that the first die has to show the number 3, for the sum to be 7 the second die has to show the number 4. So, 3 comma 4 is the only element of omega which is favorable to e intersection f.
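The two dice checks above can be verified by direct enumeration of the 36 equally likely outcomes. A minimal Python sketch (the helper names `prob` and `independent` are mine, not from the lecture):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # all 36 equally likely outcomes

def prob(event):
    # exact probability of an event under the uniform distribution on omega
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

def independent(a, b):
    return prob(lambda w: a(w) and b(w)) == prob(a) * prob(b)

f  = lambda w: w[0] == 3          # first die shows 3
e1 = lambda w: w[0] + w[1] == 5   # sum of the two numbers is 5
e  = lambda w: w[0] + w[1] == 7   # sum of the two numbers is 7

print(independent(e1, f))  # False: 1/36 != (4/36) * (1/6)
print(independent(e, f))   # True:  1/36 == (6/36) * (1/6)
```

Using `Fraction` keeps the comparison exact, so there is no floating-point rounding issue in testing the equality.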
Therefore, here the probability of e intersection f is 1 upon 36, which is equal to P e into P f, that is, 1 by 6 into 1 by 6. So, you can see how, given the event f, one can find events which are independent of f and events which are not. In the earlier case, when we took e 1 to be the event that the sum of the two faces is 5, we saw that e 1 and f were not independent, but if I take the sum to be 7, then the two events are independent. And if you define another event e 2, say, the event that the sum of the two faces is 8, then again you can show that f and e 2 will not be independent. So, you should now play around and get a feeling for the definition: take the same experiment, that is, the throwing of two dice, and try to construct events which are independent and events which are not, so that you get a better feeling. Now, continuing: as you see, we define a concept, then using the axioms we come up with new propositions, and the theory keeps developing. So, if two events e and f are independent, then e and f complement are also independent. It is very intuitive and you should be able to rationalize it, but in any case we will prove it analytically also. The proof is simple: if e and f are independent, then by definition the probability of e intersection f is P e into P f. Since we have already seen that you can write P e as the probability of e intersection f plus the probability of e intersection f complement, the probability of e intersection f complement is P e minus P of e intersection f. But because e and f are independent, P of e intersection f can be written as P e into P f, and so, taking P e common outside, this comes out to be P e multiplied by 1 minus P f, and 1 minus P f is the probability of f complement.
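The proposition that e and f complement are independent whenever e and f are can also be checked numerically on the dice example, with e as "sum is 7" and f as "first die shows 3". A small illustrative Python sketch (names are mine):

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

e = lambda w: w[0] + w[1] == 7   # sum is 7
f = lambda w: w[0] == 3          # first die shows 3
f_c = lambda w: not f(w)         # complement of f: first die is not 3

# e and f are independent; the proposition says e and f complement are too.
print(prob(lambda w: e(w) and f_c(w)))  # 5/36
print(prob(e) * prob(f_c))              # 5/36, i.e. (1/6) * (5/6)
```

The five outcomes favourable to e intersection f complement are (1,6), (2,5), (4,3), (5,2) and (6,1), matching P e into P of f complement exactly.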
And therefore, the probability of e intersection f complement has been shown to be equal to P e into P of f complement, and so by our definition e and f complement are also independent. This says that if the occurrence of f has no effect on the occurrence of e, and when I say occurrence I mean the probability we are talking of, then the non-occurrence of f should also have no effect on e. So, that is what we have concluded: if e and f are independent, then e and f complement are also independent. Now, here I am again just making a statement and I want you to construct examples for yourself. What we are saying is that if e and f are independent and e and g are independent, then e may not be independent of f intersection g. This may not sound very intuitive, but you can construct events e, f and g to show it. So, construct an experiment and then construct the corresponding events e, f and g to show that if e and f are independent and e and g are independent, then e may not be independent of f intersection g. In fact, in the throwing of two dice, the experiment that I have already taken, you can construct such events e, f and g to show that this statement is valid. But then I can come back and discuss some examples with you. Now, if you want to extend the concept of independence of two events to more than two, I will just show you how things start becoming difficult.
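One possible counterexample from the two-dice experiment, sketched in Python (the choice of events here is my own illustration, not one given in the lecture): take e as "sum is 7", f as "first die shows 3" and g as "second die shows 4".

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

def independent(a, b):
    return prob(lambda w: a(w) and b(w)) == prob(a) * prob(b)

e = lambda w: w[0] + w[1] == 7   # sum is 7
f = lambda w: w[0] == 3          # first die shows 3
g = lambda w: w[1] == 4          # second die shows 4
fg = lambda w: f(w) and g(w)     # f intersection g = {(3, 4)}

print(independent(e, f), independent(e, g))  # True True
print(independent(e, fg))                    # False: 1/36 != (1/6) * (1/36)
```

Here f intersection g is the single outcome (3, 4), which forces the sum to be 7, so knowing f intersection g changes the probability of e completely even though f alone and g alone do not.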
Even if you just want to extend the concept to three sets, that is, if three events e, f and g are to be independent, then we require not only that the probability of the intersection of e, f and g should be equal to the product of the individual probabilities, but also that when you take any two of these three sets at a time, they should be independent. It is not just that the probability of the intersection of the three sets is equal to the product of the individual probabilities; when you take two at a time, the condition of independence should also be satisfied. So, all four conditions have to be met before you can say that the three events e, f and g are independent. And you can immediately see that the moment you increase the number of sets and you want to talk about their independence, this will get more complex and the number of conditions will go on becoming larger and larger. So, I will just leave it at this point, and one can always look up advanced textbooks which give the conditions for independence of any number of events. Now, here again an interesting way to try to understand this concept would be to construct events e, f and g which satisfy not all the conditions but only some, to show that all four conditions are necessary for independence of the three events e, f and g. So, I will now take up a set of exercises and then we will come back to constructing examples for these situations.
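As one such exercise, here is a sketch, in Python, of events which satisfy the three pairwise conditions but fail the fourth; the experiment (tossing two fair coins) and the events are my own illustration: e is "first coin is heads", f is "second coin is heads", and g is "both coins show the same face".

```python
from fractions import Fraction
from itertools import product

omega = list(product("HT", repeat=2))  # 4 equally likely outcomes

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

e = lambda w: w[0] == "H"      # first coin is heads
f = lambda w: w[1] == "H"      # second coin is heads
g = lambda w: w[0] == w[1]     # both coins show the same face

# The three pairwise conditions hold: each intersection has probability
# 1/4, which equals 1/2 * 1/2.
print(prob(lambda w: e(w) and f(w)), prob(lambda w: e(w) and g(w)),
      prob(lambda w: f(w) and g(w)))
# ... but the fourth condition fails: P(e,f,g) = 1/4, while the product is 1/8.
print(prob(lambda w: e(w) and f(w) and g(w)) == prob(e) * prob(f) * prob(g))  # False
```

So the three events are pairwise independent but not independent as a triple, which is exactly why all four conditions appear in the definition.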