Welcome to this lecture on examples of entropy, joint entropy and conditional entropy. At the end of this session, students will be able to calculate the different entropies. Before starting the actual session, pause the video and think about the relationship between the different entropies. There are different entropies: the source entropy, the receiver (destination) entropy, the conditional entropies and the joint entropy. In the last video we already studied entropy, conditional entropy and joint entropy, and the relationship between them is that the joint entropy equals a conditional entropy plus the corresponding source or destination entropy: H(X, Y) = H(Y|X) + H(X) = H(X|Y) + H(Y). If you consider X as the source, the conditional entropy to use is H(Y|X); if you work from the destination Y, the conditional entropy is H(X|Y). So you can find the joint entropy in either case, provided you know the corresponding conditional entropy and the source or destination entropy.

Now let us see an example on entropy. Since entropy, conditional entropy and joint entropy were already covered in the previous session, this session is devoted to examples. The first example is about the entropy of a given source. A source generates information with probabilities p1 = 0.1, p2 = 0.2, p3 = 0.3 and p4 = 0.4; that is, it has 4 messages with probabilities 0.1, 0.2, 0.3 and 0.4. We have to find, first, the entropy of the source and, second, what percentage of the maximum possible information is being generated by this source.

We already studied the equation for entropy, H = sum over k = 1 to M of p_k log2(1/p_k), where M is the number of messages; in this case M = 4. Putting the given probabilities into that equation, H = 0.1 log2(1/0.1) + 0.2 log2(1/0.2) + 0.3 log2(1/0.3) + 0.4 log2(1/0.4) = 1.8464 bits per message. How do we get this answer? Most of the time you will get a wrong answer if you forget that the logarithm here is to base 2. On a calculator, take the ordinary log of each argument and divide it by log 2; do this for every term and you will get the correct answer, 1.8464 bits per message.

Now we have the entropy of the source. The next question asks what percentage of the maximum possible information is being generated by the source, which means we first need the maximum entropy. In the previous video lecture we already derived H_max = log2 M, where M is the number of messages. In this case M = 4, so H_max = log2 4 = 2 bits per message. Now we have the maximum possible information and the entropy of the source, and we have to find what percentage one is of the other.
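As a quick cross-check, here is a minimal Python sketch (an addition to the lecture, not part of it) that reproduces the numbers of this example, using log base 2 as emphasized above:

```python
import math

def entropy(probs):
    """Entropy in bits: H = sum of p * log2(1/p) over all messages."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

probs = [0.1, 0.2, 0.3, 0.4]             # the four message probabilities
H = entropy(probs)                        # entropy of the source
H_max = math.log2(len(probs))             # maximum entropy, log2(M)

print(f"H          = {H:.4f} bits/message")      # about 1.8464
print(f"H_max      = {H_max:.4f} bits/message")  # 2.0000
print(f"percentage = {100 * H / H_max:.2f} %")   # about 92.32
```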
So, that percentage is calculated as H, the entropy of the source, divided by the maximum entropy H_max, multiplied by 100, which gives 92.32 percent. That is the percentage of the maximum possible information being generated by the source.

Now let us take one more example on entropy. An event has 6 possible outcomes with probabilities p1 = 1/2, p2 = 1/4, p3 = 1/8, p4 = 1/16, p5 = 1/32 and p6 = 1/32; that is, one event gives 6 outputs with the probabilities stated, and we are asked to find the entropy H of the system. We already have the formula H = sum over k = 1 to M of p_k log2(1/p_k); in this case the number of outcomes M is 6. Putting in the values for all 6 outcomes: the first term is (1/2) log2 2, since 1 divided by 1/2 becomes 2; the second is (1/4) log2 4; and so on up to the sixth outcome, (1/32) log2 32. The entropy comes out to 31/16 bits per message. We keep the answer in this fractional form rather than simplifying it; we will see why shortly.

The next part of the question asks us to also find the rate of information if there are 16 outcomes per second. We already saw what the rate of information means: R = r · H, where small r is the number of messages, in this case outcomes, generated per second and H is the entropy. We have just calculated the entropy, and r is given as 16 outcomes per second, so R = 16 × 31/16; the 16s cancel, and R = 31 bits per second.

Now let us see an example on joint and conditional entropy. A discrete source transmits messages x1, x2, x3 with probabilities 0.3, 0.4, 0.3, and the source is connected to the channel given in Figure 1; calculate all the entropies. In other words, we are given the system and its probabilities, and we have to find all the entropies from this information alone: the source entropy, the receiver entropy, the joint entropy and the conditional entropies. Here x1, x2, x3 are the source messages and y1, y2, y3 are the destinations. The probability 0.8 given in the figure is for x1 going to y1, and 0.2 is for x1 going to y2, so the total contribution of x1 at the receiver side is 1 if you add these together. Similarly, x2 goes only to y2, so its contribution is 1, and x3 goes to y2 and y3 with a total contribution of 1. Each individual message has a total contribution of 1: however many destinations a message goes to, its overall probability must add up to 1. From this figure we can write the conditional probability matrix P(Y|X), that is, the output with reference to the input: for x1, the probability of y1 is 0.8, the probability of y2 is 0.2, and no information reaches y3, so that entry is 0; likewise for x2 and x3.
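To verify the 6-outcome example, here is a short Python sketch (again an addition, not part of the lecture); working with exact fractions keeps the entropy in the 31/16 form used above:

```python
from fractions import Fraction
import math

# The six outcome probabilities: 1/2, 1/4, 1/8, 1/16, 1/32, 1/32.
probs = [Fraction(1, 2), Fraction(1, 4), Fraction(1, 8),
         Fraction(1, 16), Fraction(1, 32), Fraction(1, 32)]

# Every probability is a power of 2, so log2(1/p) is an integer and
# the entropy H = sum of p * log2(1/p) stays an exact fraction.
H = sum(p * int(math.log2(1 / p)) for p in probs)
print("H =", H, "bits/message")   # 31/16

r = 16                            # outcomes per second (given)
R = r * H                         # rate of information R = r * H
print("R =", R, "bits/second")    # 31
```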
So, for x2 the only nonzero entry is a 1 to y2, and for x3 the entries are 0.3 to y2 and 0.7 to y3. Again, the total contribution of x1 across all the destinations is 0.8 + 0.2 = 1, and if you add the entries in the x2 row and the x3 row, each also comes to 1. From the problem statement we also have the individual source probabilities: the probability of x1 is 0.3, of x2 is 0.4 and of x3 is 0.3.

Now we have the conditional probabilities and the source probabilities; before finding the different entropies we have to find the remaining probabilities. The joint probability matrix P(X, Y) can be found by multiplying the rows of the conditional probability matrix P(Y|X), which we derived on the previous slide, by the source probabilities: the 0.8 and 0.2 in the first row are multiplied by the probability of x1, the second row by the probability of x2, and the third row by the probability of x3; the zero entries are not affected. This gives the final joint probability matrix P(X, Y). For it to be a valid joint probability matrix, all the terms added together must give the value 1; if you add them here the sum is indeed 1, so it is a valid joint probability matrix.

From this you can easily find the probabilities of y1, y2 and y3: just add the columns of the joint probability matrix. Doing that, the probability of y1 becomes 0.24, the probability of y2 becomes 0.55 and the probability of y3 becomes 0.21. Now you have the source probabilities, the destination probabilities, the joint probability matrix and the conditional probability matrix P(Y|X). One more probability matrix is needed, P(X|Y), and it can be obtained by dividing the joint probability matrix by the probabilities of Y: each column of the joint probability matrix derived on the previous slide is divided by the corresponding P(y1), P(y2) or P(y3), and that gives the second conditional probability matrix, P(X|Y).

Now we have all the probabilities, so let us find the entropies. The source entropy is H(X) = -sum of P(x) log2 P(x); note the minus sign, which makes this the same as the earlier equation written with log2(1/p). We have three source messages x1, x2, x3, so there are three terms; if you put in the values you will get the answer shown, which you can verify. Similarly we compute H(Y); the first is the source entropy and the second is the destination or receiver entropy. The same applies to the joint entropy H(X, Y), where you have to use the joint probabilities, and then to the conditional entropies H(X|Y) and H(Y|X); the answers are shown on the slide and you can verify them. These are the references. Thank you.
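For completeness, here is a minimal Python sketch (an addition, assuming the channel matrix and source probabilities read off Figure 1 as above) that builds the probability matrices and computes all the entropies so the answers can be verified:

```python
import numpy as np

# Source probabilities P(x1), P(x2), P(x3) and the channel matrix
# P(Y|X) read off Figure 1 (rows are x1..x3, columns are y1..y3).
p_x = np.array([0.3, 0.4, 0.3])
p_y_given_x = np.array([[0.8, 0.2, 0.0],
                        [0.0, 1.0, 0.0],
                        [0.0, 0.3, 0.7]])

# Joint matrix: multiply each row of P(Y|X) by the corresponding P(x).
p_xy = p_y_given_x * p_x[:, None]
assert abs(p_xy.sum() - 1.0) < 1e-12      # valid joint probability matrix

# Destination probabilities: column sums of the joint matrix.
p_y = p_xy.sum(axis=0)                    # [0.24, 0.55, 0.21]

# P(X|Y): divide each column of the joint matrix by the matching P(y).
p_x_given_y = p_xy / p_y

def H(probs):
    """Entropy in bits of a probability vector (zero entries skipped)."""
    p = probs[probs > 0]
    return -np.sum(p * np.log2(p))

H_x  = H(p_x)              # source entropy H(X)
H_y  = H(p_y)              # receiver entropy H(Y)
H_xy = H(p_xy.ravel())     # joint entropy H(X, Y)
H_y_given_x = H_xy - H_x   # H(Y|X) = H(X, Y) - H(X)
H_x_given_y = H_xy - H_y   # H(X|Y) = H(X, Y) - H(Y)

print(H_x, H_y, H_xy, H_y_given_x, H_x_given_y)
```

The conditional entropies in this sketch are obtained from the relationship stated at the start of the lecture, H(X, Y) = H(Y|X) + H(X) = H(X|Y) + H(Y), which is equivalent to summing -P(x, y) log2 P(x|y) over the joint and conditional matrices directly.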