Hi, I'm Zor. Welcome to a new Unizor lecture. We continue talking about random variables. After the introduction to random variables, which was during the last lecture, let's talk about certain properties of random variables, and today's theme is the expected value of a random variable. I do suggest you watch this lecture from the Unizor.com website instead of directly on YouTube or elsewhere, because the website contains notes to this lecture, and it is very important to familiarize yourself with them. Before or after the lecture doesn't really matter, but it's very important to read the notes. Okay, so back to expected value. Before presenting this in a purely theoretical form, I would like to go through a couple of examples where it will be quite obvious what expected value actually is. Example number one: the game of roulette. In roulette you have a wheel with different partitions. The American version of this game has 36 numbers from 1 to 36, plus 0 and 00, so 38 different partitions on the wheel. You spin the wheel and spin the ball, and after they stop spinning, the ball falls into one of these partitions. Now, you have a choice of what to bet on, but we will consider only one particular version of this game, where you bet on a certain number. Let's say you bet on number 23; it doesn't really matter which. If the ball falls on number 23, you win; if it doesn't, you lose. Obviously, with 38 different partitions, the chance of the ball falling on number 23 is very small and the chance of losing is very large, so we have to equalize this with some payoff. The standard payoff is that if you win, so if the ball falls on the number you predicted, you get 36 times what you bet. Let's say you bet a dollar; then you win $36. But if it's not 23, you simply lose your bet, which is $1. So that's the game.
Well, a random experiment here is basically one play of the game. So what are the results of this random experiment? There are 38 different results: one of them is a success for you, and the other 37 are not. Now, considering these 38 results have presumably equal chances to occur, because the wheel is made correctly and the ball is round, etc., all these partitions have an equal chance, which means, since there are 38 of these partitions, we have to assign a probability of 1/38 to each result. Okay. Now, what does it mean that the probability of each result is 1/38? It means that if you run, say, n different experiments, where n is a really large number (and the larger the better), then approximately n/38 of them will be, say, 23, another n/38 of them will be number 1, another n/38 will be, say, 00, etc., so each of these 38 results will occur approximately n/38 times, right? That's what it means that out of n experiments we have 38 different results with equal chances: approximately 1/38 of the n results will be one concrete number, say number 23. All right. Which means that if we conduct n experiments and n is relatively big, then in n/38 cases we will win $36. In all the other cases, and how many other cases are there? If n/38 are winning, then 37·n/38 are the losing results, so in those we lose $1, so we have to subtract 37·n/38 times $1. So what's the result of this arithmetic? Well, the result is 36·n/38 − 37·n/38 = −n/38, right? So this is approximately the monetary result of n games. Approximately, which means that as n increases, the real result will be closer and closer to this, but not necessarily exactly; I mean, you can play 10 games and win all 10 times.
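The frequency interpretation above can be made concrete in code. Here is a small Python sketch of my own (not from the lecture) that spins a simulated 38-pocket wheel n times and checks that a particular pocket comes up roughly n/38 times:

```python
import random

random.seed(0)
n = 1_000_000

# Simulate n spins of an American roulette wheel: 38 equally likely pockets,
# labeled 0..37 here for simplicity.
counts = {pocket: 0 for pocket in range(38)}
for _ in range(n):
    counts[random.randrange(38)] += 1

# Each pocket should come up roughly n/38 times (about 26,316 for n = 1,000,000).
print(n / 38)
print(counts[23])
```

For a million spins the count for any single pocket lands within a few hundred of n/38, which is exactly the "approximately n/38 times" claim in the lecture.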
Yes, in theory it is possible, although unlikely; but if you go through, say, a million games, more or less it will be approximately something like this. There are certain theorems which estimate the closeness of the real results to this average result. Okay, so if −n/38 is approximately the monetary result of your playing n games, then the average per game would be that divided by n, right? So the average result per game is −1/38. That is, if you bet $1 on a game, then the average result of this game is −1/38 of a dollar, which means you will lose, basically, and the casino will win. And that's how a casino makes money: on averages. You see, if the number of people is great and everyone is playing many different games and the average per game is something like this, then as the number of games grows, the result will be closer and closer to the casino winning 1/38 of each dollar. Well, basically, that's the example. And what I actually wanted to say is that this average, obviously, is not necessarily something which really occurs in any single game. I mean, if you bet a dollar, you either lose a dollar or you win $36. You will never lose exactly 1/38 of a dollar. But again, among many, many different games, if you average by the number of games, that would be the result, provided the number of games is relatively large. Now, ideally, speaking mathematically, if n tends to infinity, then the average net result per game would tend to −1/38. Okay, so that's this particular example. Let's consider another example. Actually, methodically, I think it's very important to explain a couple of concepts through examples first and then go to a more abstract description of what exactly we're talking about. So that's what I'm basically doing.
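The roulette arithmetic can be checked by simulation. This is a sketch of my own, using the lecture's payoff model (a net win of $36 with probability 1/38, otherwise a loss of $1); the sample average per game should come out near the theoretical −1/38 ≈ −0.0263:

```python
import random

random.seed(1)
n = 1_000_000
total = 0
for _ in range(n):
    # One spin: 38 equally likely pockets; we always bet $1 on pocket 23.
    if random.randrange(38) == 23:
        total += 36   # net win, per the lecture's payoff model
    else:
        total -= 1    # lose the bet

average_per_game = total / n
# average_per_game comes out close to -1/38; the exact value varies with the seed
print(average_per_game)
```

As the lecture says, no single game ever produces −1/38 of a dollar; only the average over many games approaches it.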
So the second example. At the beginning of this course on probabilities, I talked about the Chevalier de Méré, who lived in France a very, very long time ago and who was one of the first people to approach the game of dice with a certain, well, scientific method, if you wish. He was trying to modify the basic game, and his proposition was the following: we will roll four dice, or we will roll one die four times, it doesn't really matter. As a result of this experiment, we look for the number one appearing among these four dice. If it appears, then the Chevalier de Méré, who actually invented this particular game, would consider himself the winner. If he loses, obviously, he pays whatever amount he bet. So let's say again he bet one; I shouldn't say a dollar, there were no dollars at the time; one doubloon or whatever. So he bet one, and now we have four dice, and we are looking for the number one to appear among these four dice. All right, how can we calculate the probability? Well, we actually talked about this particular problem during the regular lectures on probabilities. And the really convenient way to do it is this: you see, a one can appear on one die, or two dice, or three, or all four dice, and each of these events has its own probability. It's easier to calculate the opposite probability, that there are no ones at all, which means that the probability of not having any one on the four dice is the combination of the probabilities of not having a one on each of them, right? So the probability for the first die not to show a one is 5/6: out of six different sides of the cube, five are not one, and only one side is a one.
Now, with each of these five sides of the first die, we have five possible sides for the second die, five for the third, and five for the fourth, right? So the number of combinations which contain no ones is 5 × 5 × 5 × 5, and the total number of combinations is 6 × 6 × 6 × 6. So the probability of not having any ones is (5/6)⁴, which is approximately 0.4823. Okay, so this is the probability of not having any ones on the four dice, and obviously the probability of having at least one one (either one, or two, or three, or four ones) is the opposite of this, which is 1 − (5/6)⁴, approximately 0.5177. Okay, so here is the game: this is the probability for the Chevalier de Méré to lose the game, when there are no ones, and this is the probability to win the game. As you see, the probability of winning is greater than one half, so it looks like he will be winning, right? So how can we express this in quantitative terms? Well, again, let's consider that he plays n games against us. Now, if these are the probabilities of winning and losing, then he will win his one doubloon (or louis d'or or whatever else) in approximately 0.5177·n cases, and he will lose, I put the minus sign, in approximately 0.4823·n cases. Since we are playing with one doubloon only, his total, not average, sorry, the approximation of his net sum after n games is (0.5177 − 0.4823)·n, which is equal to n times 0.0354. So this is the net amount he might approximately gain after playing n games, if n is a significantly large number.
So per game, excuse me, per game we have to divide by n, and it is 0.0354. So this is his average win per game. It doesn't mean that he actually wins this particular amount every game, obviously not. He either wins one or loses one, never this fractional amount; but on average, if he plays a large number of games, then his average win per game is this. So this is how I'm basically approaching the concept of expected value: I'm calculating the approximate value of winning or losing, or whatever, per one particular random experiment. That's the idea, and it is a characterization of the random experiment itself, or of the game in this particular case. So if my average per game is positive, as in this particular case, then it makes sense for the person who plays this way, like the Chevalier de Méré here, to play the game, because on average he wins each game. So even if he wins some and loses some, the winning will exceed the losing, eventually, as the number of experiments goes to infinity. It doesn't mean that he will win during the first five games; he might actually lose them. But again, his purpose is to make the number of games as big as possible, and that's what makes him a net winner. Same thing with the casino: in the previous game of roulette, which I was talking about, the casino's purpose is to invite as many people as possible and force them to, well, encourage them, I shouldn't say force, encourage them to play as much as possible, because only large numbers will give the Chevalier de Méré, or the casino, a net result close to this theoretical average. All right. So that's it for examples, and now let's go to pure theory. Now, the pure theory goes along exactly the same lines as I was explaining in the examples. So first of all, we have to describe our random events. Let's call it Ω, a set of certain elementary events, right?
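The numbers in de Méré's game are easy to verify directly. A small Python check of my own, reproducing the lecture's figures:

```python
# Probability that four dice show no ones at all: (5/6)^4
p_lose = (5 / 6) ** 4          # ≈ 0.4823

# Probability of at least one one among the four dice
p_win = 1 - p_lose             # ≈ 0.5177

# Expected value per game for a one-doubloon stake: win +1, lose -1
ev_per_game = p_win * 1 + p_lose * (-1)   # ≈ 0.0354
print(p_lose, p_win, ev_per_game)
```

The positive expected value of about 0.035 doubloons per game is exactly why the game favored de Méré as long as he could keep playing.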
Now, we might have, let's say, k different elementary events. In the case of roulette, we have 38 different results of the experiment, 38 elementary events, right? In the case of the Chevalier de Méré, we have basically only two: either there is a one among the four dice or there is not. So there are only two. Now, the second thing we need is the probabilities of these events. So let's say we know the probabilities p1, p2, …, pk of these elementary events. In the case of roulette, it was 1/38 for each of them. In the case of the game suggested by the Chevalier de Méré, where there are only two events, one probability was 0.51-something and the other was 0.48-something. But the sum of them, obviously, is supposed to equal 1. All right. Next, we are talking about a certain random variable, which in our examples is the winning, basically: the amount you win. Well, it depends on the elementary event, right? In the case of roulette, when we have 38 different results and we are betting on number 23, on number 23 we have +36 as the winning, and all the others have −1, because we are betting $1, and that is the losing result. So in the case of roulette we would have −1, −1, −1, then in one particular case +36, and then −1, −1 again until the end. In the case of the Chevalier de Méré's game, we would have basically +1 and −1: one we win or one we lose, right? In any case, we have certain values, and I will denote them x1, x2, …, xk. This is the random variable, which is defined on these elementary events. So I can actually say that the value of the random variable on the event eᵢ is equal to xᵢ: ξ(eᵢ) = xᵢ, where ξ, the Greek letter xi, is usually used for a random variable in the theory of probabilities. All right, so we've got that. This is our problem. Now, the question is: what's the average value of this random variable per experiment if the number of experiments goes to infinity? That's what we're looking for.
Our average win per experiment when the number of experiments is very large: that's what gives us a feeling for evaluating the game. Is it actually worth playing this particular game in the casino, right? So the average value of this random variable as the number of experiments goes to infinity is what we are interested in. All right, so how can we evaluate it? Well, suppose we have n different experiments, where n is a very large number. What are the results of these experiments? Well, approximately in n·p1 cases the result will be e1, approximately in n·p2 cases the result will be e2, etc., and in n·pk cases the result will be ek, right? Because that's what probability actually means. If we assign these probabilities to these elementary events, that is the meaning of probability. So if the probability is 1/38 for each, it means that n multiplied by the probability, that is n/38, is roughly how many times we will have one particular result of the spinning wheel, say result 23, or 0, or 00, or whatever else, right? So that's what it means; this is basically the definition of the concept of probability. And since the results will be e1, e2, …, ek, the values of our random variable will be x1, x2, …, xk, right? So what's the average? Well, if we have n different values of our random variable, then the sum of these, divided by the total number of variables, total number of experiments, sorry, is equal to the average per experiment, right? So in n·p1 cases my value is x1, in n·p2 cases my value is x2, which means that if I want to sum up all the values which our random variable took during all these experiments, what would be my result? Well, my result would be n·p1·x1 + n·p2·x2 + … + n·pk·xk, and I divide it by n, the number of experiments, right?
Which is equal to, I will use the sigma sign, Σ pᵢxᵢ for i from 1 to k, which means p1x1 + p2x2 + p3x3 + … + pkxk. So this is called the mathematical expectation, or expected value, of our random variable ξ, defined by this formula. So all we did was calculate the average value of our random variable. We summed up all the values which it took during all the n experiments (n·p1 times it took the value x1, etc., etc., n·pk times it took the value xk), and we divided by the number of experiments. That's the average per experiment, right? That's how we calculate an average: we sum up all the values and divide by the number of units we have summed, right? Basically, that's it. That's the definition. Now, does it mean that it's right to say that the random variable ξ takes this average value? Well, that's not exactly correct. The word "takes" is wrong: it doesn't really necessarily take this value, because as we saw, the value can be something like −1/38, and the variable never takes this particular value. It takes either +36 or −1, right? But again, the average is nevertheless −1/38. Now, the correct way to put it is: knowing the expected value of the random variable, all we can say is that if we experiment many, many times and calculate the average value per experiment, then it will be close to this. And the more experiments we conduct, the closer our average per experiment will be to this theoretical expected value, or expectation, or mathematical expectation. All right. So, the next lecture I will probably devote to a couple of examples of how to calculate this expected value. By the way, notice that this expected value basically lost its, well, dependency on elementary events. We don't really need the elementary events.
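The definition E[ξ] = Σ pᵢxᵢ translates directly into code. Here is a minimal sketch (the function name is my own choice) applied to both examples from this lecture:

```python
def expected_value(probs, values):
    """E[xi] = p1*x1 + p2*x2 + ... + pk*xk for a discrete random variable."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return sum(p * x for p, x in zip(probs, values))

# Roulette: one pocket pays +36 (net), the other 37 pockets lose the $1 bet.
ev_roulette = expected_value([1/38, 37/38], [36, -1])   # = -1/38 ≈ -0.0263

# De Mere's game: win +1 with probability 1 - (5/6)^4, lose -1 otherwise.
p_win = 1 - (5 / 6) ** 4
ev_de_mere = expected_value([p_win, 1 - p_win], [1, -1])   # ≈ 0.0355
print(ev_roulette, ev_de_mere)
```

Note that the function needs only probabilities and values, not the elementary events themselves, which is exactly the observation made at the end of the lecture.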
We need just the probabilities of these elementary events and the values which our random variable takes when the experiment results in each particular elementary event. So we can say it slightly differently. Okay, let's consider a random variable which takes the value x1 with probability p1, x2 with probability p2, etc., and xk with probability pk. And what would be its expected value? Well, that's the formula, basically: we have to sum p1·x1 + p2·x2 + … + pk·xk. So, basically, there is no elementary event involved in this particular phrasing, if you wish. All we need is the values of the random variable and the probabilities with which it takes these values, and then we can arrive at the expected value. All right, that's it for today. I suggest you again look at the notes for this lecture on Unizor.com. And that's it. Thank you very much. Good luck.