Hi, I'm Zor. Welcome to Unizor education. This lecture is about probability distributions, in particular, discrete probability distributions. This is part of the advanced course of mathematics for teenagers, for high school students, presented on Unizor.com, and that's where I suggest you watch this particular lecture, because the website has comments and notes which add something significant to whatever I'm talking about, and it's always beneficial to read the notes before you listen to the lecture. Alright, so back to discrete probability distributions. First of all, what do we mean when we talk about a distribution? Well, we can distribute money, we can distribute weight among certain things. Probability is a measure, very much like money, like weight, like length, or whatever else, so we can distribute probability as well. In the basic finite case we have a sample space which models our random experiment. This sample space contains a certain number of elementary events, say n of them in the finite case, which model the n different possible outcomes of the random experiment. Now, with each elementary event we associate a certain non-negative constant, which is called its probability. The probability is basically the limit of the frequency of occurrence of a particular event: if we continue conducting our random experiment under identical conditions, with the number of experiments increasing limitlessly, then the ratio of the number of occurrences of a particular event to the number of experiments has a limit, and that limit is its probability. The only requirement for these probabilities is that their sum is equal to one. 
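This limiting-frequency idea can be illustrated with a quick simulation. Here is a small Python sketch (the helper name `relative_frequency` is just for illustration): as the number of repetitions of a fair-coin experiment grows, the ratio of successes to trials settles near the probability 1/2.

```python
import random

def relative_frequency(trials: int, p: float = 0.5, seed: int = 0) -> float:
    """Relative frequency of 'success' in repeated Bernoulli trials
    with success probability p (illustrative helper)."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    successes = sum(rng.random() < p for _ in range(trials))
    return successes / trials

# The ratio drifts toward p = 0.5 as the number of trials grows:
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

This doesn't prove convergence, of course, but it shows the frequency settling around the probability, which is exactly the intuition behind the definition.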
So we always have the total probability measured as a unit, a combined weight of one, and we are distributing this weight among the different elementary events. This set of probabilities, one for each elementary event, is the probability distribution. We distribute the weight of one among different elementary events: one half to this one, one quarter to that one, etc. Okay, so this is the probability distribution when we're talking about a sample space, a random experiment, etc. Now, what if we're talking about a random variable? Well, a random variable is a numerical function defined on these elementary events. Let's call it ξ: it's a function, which means on every elementary event e_k it has a certain numerical value, say x_k. Now, what can we say about this particular random variable? If it takes the value x_k when our random experiment ends up in the elementary event e_k, it means that the probability of our random variable being equal to x_k equals the probability of the elementary event e_k, which is p_k. So the same set of probabilities which we defined for our elementary events also defines the distribution of probabilities for our random variable among its values: the value x_k is taken with probability p_k. This is also a distribution of probabilities. All we are talking about right now is terminology. So what is a distribution of probability? Well, in the finite case it's just a set of probabilities, each of them non-negative, with their sum equal to 1, and they define the distribution. All right? Okay, now why do we call it a discrete distribution? Well, because these values are separated from each other. So it's not like all the real numbers. 
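The passage from event probabilities to the distribution of a random variable can be sketched in a few lines of Python. The events, probabilities, and values below are made-up toy numbers; the sketch also shows the natural extension where two events happen to share the same value, in which case their probabilities add up.

```python
# Toy sample space: events e1..e4 with probabilities p1..p4 (sum = 1),
# and a random variable xi assigning a numeric value to each event.
event_prob = {"e1": 0.5, "e2": 0.25, "e3": 0.125, "e4": 0.125}
xi = {"e1": 10.0, "e2": 20.0, "e3": 20.0, "e4": 30.0}

# P(xi = x) is the total probability of all events mapped to x.
dist = {}
for event, p in event_prob.items():
    x = xi[event]
    dist[x] = dist.get(x, 0.0) + p

print(dist)                # distribution of xi over its values
print(sum(dist.values()))  # the weights still add up to 1
```

If every event maps to a distinct value, as in the lecture, the distribution of ξ is literally the same set of probabilities p_1, ..., p_n, just attached to the values x_1, ..., x_n.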
There is no such thing as a separation between two consecutive real numbers; the separation there is zero, if you wish. But in the case of, say, n finite numbers, there is always some distance between them. They are separated from each other, and that's why we call them discrete. Now, this is the finite case. How about the infinite case? Well, exactly the same thing. We are still talking about discrete probability distributions: even if the number of elementary events is infinite, we can still have some distance between consecutive values. Now, in what cases is this possible? Well, for instance, if we have an infinite but countable number of elementary events. And here's the experiment: consider elementary events which are the natural numbers, an infinite but countable number of elementary events. Now, our probability distribution, you know the requirements, right? Each probability should be non-negative, and their sum should be equal to one. And there is an infinite number of these probabilities, so they cannot all be, say, one half each, right? But what can be done is the following: p_1 is equal to one half, p_2 is equal to one quarter, p_3 is equal to one eighth, etc., p_n is equal to one over two to the power of n, etc., an infinite number of probabilities. Remember the infinite geometric progression? This is an example of it, right? Now, what's the sum of them? p_1 plus p_2 plus etc. plus p_n plus etc. to infinity; this is one half plus one quarter plus one eighth plus etc. plus one over two to the nth, etc. And what is it? It's equal to one, remember? First we take one half, then one quarter, then one eighth, then one sixteenth, etc. This is an infinite geometric progression, and its sum is equal to one. 
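These partial sums are easy to check numerically. A short Python sketch: after twenty terms the sum 1/2 + 1/4 + ... + 1/2^20 sits just below 1, short of it by exactly 1/2^20, and each further term halves that gap.

```python
# Partial sums of p_n = 1/2**n: they approach 1 but never exceed it.
total = 0.0
for n in range(1, 21):
    total += 1 / 2**n

print(total)  # just under 1; the gap to 1 is exactly 1/2**20
```

Each partial sum equals 1 - 1/2^n, which is why the limit is 1: the remaining gap shrinks to zero.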
If somebody forgot what a geometric progression is, return to one of the earlier lectures; I actually explain it in one of those. Alright, so this is an example of finite or infinite but countable probability distributions, which we can call discrete. Now, what's very useful in many cases is to present this distribution of probabilities graphically, and in the discrete case it's actually very easy to do. Here is how. Back to our original definition: here is our sample space with elementary events and the corresponding probabilities, and the sum of these probabilities equals one. Now, what can we do graphically? Here it is. We have n different elementary events, so on the horizontal axis I will put the numbers 1, 2, 3, 4, up to n. Now, between 0 and 1 I will build a rectangle. The width is 1, obviously, but the height is p_1, and p_1 is less than 1, right? So it's something like this. Then between 1 and 2 I will put a rectangle with height p_2, then p_3, p_4, p_5, etc. Now, the sum of the heights, or, which is actually better, the sum of the areas, is equal to 1. So why is the total area equal to 1 as well as the sum of the heights? Well, because the width of each rectangle is equal to 1, right? So we multiply the width, which is 1, by the height, and the height is numerically equal to the area. So the area under this staircase, if you wish, is equal to 1. We can change the probabilities, one can go higher, another could go lower, but in any case, whatever we change, the total area under this staircase is always equal to 1. And this graphically shows how, in this particular case, the probability is distributed: for instance, that the highest probability belongs to the first elementary event. Now, what if our experiment is rolling a die? We have six different elementary events, and each of their probabilities is the same, equal to 1/6, right? So in this particular case we will have six rectangles, all of the same height, 1/6, and that height is also the area. 
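The width-one trick, height equals area, can be verified directly. A small Python sketch for the fair die, using exact fractions so there's no rounding to worry about:

```python
from fractions import Fraction

# Fair die: six elementary events, each with probability 1/6.
heights = [Fraction(1, 6)] * 6  # heights of the six rectangles
widths = [1] * 6                # each rectangle has width 1

# Width 1 makes each area numerically equal to its height,
# so the total area under the "staircase" is the total probability.
areas = [w * h for w, h in zip(widths, heights)]
print(sum(areas))  # → 1
```

Change the six heights to any other non-negative fractions summing to 1 and the total area stays 1; that is exactly the invariant the staircase picture shows.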
And the total area of these six rectangles is still equal to 1. Now, how about our example with an infinite number of elementary events and probabilities 1/2, 1/4, 1/8, etc., 1 over 2 to the power of n, etc.? How will the probability distribution look graphically in this case? Well, this is actually something like what I drew the first time: this is 1/2, then 1/4, then 1/8, each one half the height of the previous one. So this is my probability distribution, and it goes on infinitely, right? But the area under this staircase is still equal to 1. By the way, this gives a very interesting random variable: define the value of the random variable on the event e_i, which is actually the number i, to be exactly the number i. So we have a random variable which takes the value i with probability equal to 1 over 2 to the power of i. Now, this is an interesting random variable, and we just built the graph of its probability distribution. Now, what's wrong with having an infinite but countable number of values? Well, basically nothing. You can, for instance, calculate the mathematical expectation of this variable. It's not really part of this probability distribution topic, but it's still an interesting thing, and I actually suggest in the notes, as an exercise, the calculation of the expected value of this particular random variable. You know, the expected value is one value times its probability, plus another value times its probability, et cetera. Now, in this case, the value is 1 with probability 1/2, the value is 2 with probability 1/4, the value is 3 with probability 1/8, et cetera; the value is n with probability 1 over 2 to the power of n, et cetera. So this sum is the expectation of our ξ. How do we find out what this particular sum equals? Well, actually, there is a little trick, and here it is. Call this sum s. Now, what about s divided by 2? 
What is it? Well, it's the same terms shifted: 1/4, then 2/8, et cetera, with general term n minus 1 over 2 to the n, et cetera, right? If you subtract one from the other, you get s minus s over 2: the first term is 1/2; then 2/4 minus 1/4 equals 1/4; 3/8 minus 2/8 equals 1/8; et cetera; n over 2 to the n minus n minus 1 over 2 to the n equals 1 over 2 to the n; et cetera. And we know that this sum is equal to 1, right? This is a plain geometric progression. So s minus s over 2 is equal to 1, from which we derive that s over 2 is equal to 1, so s is equal to 2. So the expectation of this random variable is 2. So it's a normal thing. I mean, there is nothing wrong with introducing an infinite but countable number of elementary events and probabilities. The probabilities are still attached to values separated from each other, and this is why it's called discrete. So this is an infinite discrete distribution. All right. That's it for today. The next lecture will be about continuous distributions, which are a little bit more difficult. So, that's it for today. Try to read the notes for this lecture on Unizor.com again; I strongly recommend you do it. That's it. Thanks very much and good luck.
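The answer s = 2 can be double-checked numerically. A short Python sketch with exact fractions: the partial sums of n/2^n follow the closed form 2 - (k + 2)/2^k (which the subtraction trick above implies), so they climb toward 2.

```python
from fractions import Fraction

# Partial sums of the expectation s = sum over n of n * (1/2**n).
# Fraction keeps the arithmetic exact, with no floating-point rounding.
s = Fraction(0)
for n in range(1, 31):
    s += Fraction(n, 2**n)

print(s)         # exactly 2 - 32/2**30
print(float(s))  # already very close to 2
```

After 30 terms the sum is short of 2 by only 32/2^30, i.e. 1/2^25, and the gap keeps shrinking, consistent with the expectation being exactly 2.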