Hello everyone, this is Alice Gell. In the previous video, I gave a brief review of two important concepts: unconditional independence and conditional independence. In this video, I'm going to discuss two properties of these concepts. The first is how we can use independence assumptions to derive a compact representation of a probability distribution. The second is how we can use these definitions to verify independence relationships between random variables given a probability distribution. First, let's think about how we can take advantage of independence assumptions to derive a compact representation of a joint probability distribution. I have two questions for you to think about, on this slide and the next. Both questions consider the same scenario: we have three random variables, and we'll assume they are all Boolean random variables. The first question asks you to think about the minimal number of probabilities we need to specify the joint distribution over these three random variables. This question is the same on both slides. For the second question, we'll assume the three random variables have some sort of independence relationship, and given that assumption, we'll again calculate the minimal number of probabilities required to specify the joint distribution. On this slide, the independence relationship we assume in part two is that A, B, and C are independent. This independence holds for each pair: A and B are independent, B and C are independent, and A and C are independent. Think about this question for yourself a bit, pause the video, and then keep watching for my answer. Here are the answers.
For part one, when we know nothing about these random variables and make no assumptions, we need at least seven probabilities to specify the joint probability distribution. There's a general formula you can use here: if you have n Boolean random variables, then in general, to specify the joint probability distribution over them, you need to specify at least 2^n - 1 probabilities. Now for part two, we assume that A, B, and C are independent. Given this assumption, what's the minimal number of probabilities? In this case, we only need to specify three probabilities. For a longer discussion of how I derived these numbers, please watch the separate video about this slide. The important message from this question is the following. When we know nothing about the random variables, we need to specify quite a few probabilities to pin down the joint distribution; in general, the number of probabilities we need is exponential in the number of random variables. However, if we know some independence relationships between the variables, then we need fewer probabilities to derive the joint distribution. Here the number decreased from seven to three, which is not a whole lot of savings because we only have three random variables. But with n random variables, the saving goes from 2^n - 1 probabilities down to n, which can be quite large. So the previous question told us that knowing some unconditional independence assumptions can help us derive a more compact representation. What about conditional independence assumptions? In this question, part one is exactly the same, so again we need seven probabilities to represent the joint distribution. Part two is a little different: instead of an unconditional independence assumption, we now know a conditional independence relationship.
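Before we look at that, here is a quick sketch of where the count of three comes from in the unconditional case. The marginal probabilities below are hypothetical values chosen just for illustration; the point is that under full independence the joint factorizes into the three marginals, so three numbers determine all eight joint probabilities.

```python
from itertools import product

# Three independent Boolean variables: one number each, P(X = True).
# These values are hypothetical, chosen only for illustration.
p_a, p_b, p_c = 0.2, 0.5, 0.7

def p(x, prob_true):
    """Probability that a Boolean variable takes value x."""
    return prob_true if x else 1.0 - prob_true

# Under full independence, P(a, b, c) = P(a) * P(b) * P(c),
# so 3 parameters determine all 2**3 = 8 joint probabilities.
joint = {(a, b, c): p(a, p_a) * p(b, p_b) * p(c, p_c)
         for a, b, c in product([True, False], repeat=3)}

# Sanity check: the 8 joint probabilities sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Parameter counts: 2**n - 1 with no assumptions, n under independence.
n = 3
print(2**n - 1, n)  # 7 3
```

Note that without any assumption, the eighth joint probability is determined by the other seven (they must sum to one), which is why the general count is 2^n - 1 rather than 2^n.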
So we assume that A and B are conditionally independent given C. Given this assumption, what is the minimal number of probabilities we need to specify the joint distribution? Pause the video, think about it yourself, and then keep watching. The correct answer is that we need five probabilities to represent the joint distribution under this conditional independence assumption. Again, watch the separate video for a longer discussion of how I derived this number. The important message here is the same as before: knowing some conditional independence assumptions saves us work, because we can use fewer probabilities to represent the same joint probability distribution. Let me stop here for this video. After watching it, you should be able to explain how knowing the independence relationships between random variables allows us to derive a compact representation of a joint probability distribution. Thank you for watching. I will see you in the next video. Bye for now.
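As a supplementary sketch of where the count of five comes from: with A and B conditionally independent given C, it suffices to specify P(C) (one number), P(A | C) (two numbers, one per value of C), and P(B | C) (two numbers). The values below are hypothetical, chosen only for illustration.

```python
from itertools import product

# A and B conditionally independent given C: 5 parameters suffice.
# All numeric values here are hypothetical, chosen only for illustration.
p_c = 0.6                              # P(C = True)          -> 1 number
p_a_given_c = {True: 0.9, False: 0.3}  # P(A = True | C = c)  -> 2 numbers
p_b_given_c = {True: 0.4, False: 0.8}  # P(B = True | C = c)  -> 2 numbers

def p(x, prob_true):
    """Probability that a Boolean variable takes value x."""
    return prob_true if x else 1.0 - prob_true

# Chain rule plus conditional independence:
# P(a, b, c) = P(c) * P(a | c) * P(b | c)
joint = {(a, b, c): p(c, p_c) * p(a, p_a_given_c[c]) * p(b, p_b_given_c[c])
         for a, b, c in product([True, False], repeat=3)}

# Sanity check: the 8 joint probabilities sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-12
```

So the saving relative to the unconstrained case is seven parameters down to five; weaker than full independence (three), but still a saving, and the gap grows with more variables.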