Dear students, I would like to present to you the concept of the expected value of a random vector. Suppose we have a random vector, capital X. Whenever we write a random vector capital X, we write it in boldface; it is the general convention to use bold to convey that it is a vector. So let me start again: let bold capital X be a random vector given by two random variables X1 and X2. Then the expected value of the vector X exists if the expectations of X1 and X2 exist. And students, the expression is very interesting, as you can see now on the screen: the expected value of the random vector X is the vector of the expected values of X1 and X2. Isn't that interesting? I repeat: the expected value of a random vector is the vector of the expected values of its component random variables. So let's take an example. Suppose I toss two fair coins together using my right hand and I define the random variable X1 as the number of heads that I obtain. Then, students, what will be the expected value of X1? You can work it out using the formula that applies in this case, but let me tell you that you can do it very quickly: the expected value of X1 is equal to 1. Why? You have tossed two fair coins together, so the possible outcomes are head-head, head-tail, tail-head and tail-tail. If the outcome is head-head, X1 is equal to 2, because X1 is the number of heads. If it is head-tail or tail-head, X1 is equal to 1, and if it is tail-tail, X1 is equal to 0. The corresponding probabilities are 1/4 for X1 = 2, 2/4 for X1 = 1 and 1/4 for X1 = 0. From these probabilities, which you can find yourself very quickly, you can see that this is a symmetric distribution. And if a distribution is symmetric, then its mean has to be at the exact center. So among the values 0, 1 and 2, the center is 1.
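To check this arithmetic, here is a small Python sketch of the two-coin calculation (the dictionary and variable names are my own, not from the lecture); it computes E(X1) directly from the probability mass function described above:

```python
from fractions import Fraction

# X1 = number of heads when two fair coins are tossed.
# Sample space: HH, HT, TH, TT, each with probability 1/4.
pmf_x1 = {
    2: Fraction(1, 4),  # HH
    1: Fraction(2, 4),  # HT or TH
    0: Fraction(1, 4),  # TT
}

# E(X1) = sum over values of value * probability.
e_x1 = sum(value * prob for value, prob in pmf_x1.items())
print(e_x1)  # 1
```

Using exact fractions rather than floats keeps the symmetry of the distribution visible in the arithmetic itself.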
Why did we say the probabilities are 1/4, 2/4 and 1/4? Because we are saying that those coins are fair. Now, suppose that with my left hand I am doing another experiment: I am tossing a fair die and noting the number on the uppermost face. This random variable I denote by X2. If X2 denotes the number of dots on the uppermost face of the die, students, what will be the expected value of X2? Again, you can do the calculations, but you should develop the skill to find it in your head very quickly and to say that the expected value of X2 is equal to 3.5. Now, how did I say this? Because it is a fair die. If it is a fair die, then each of the 6 possible faces, 1 through 6, has probability 1/6. When each probability is 1/6, it is a discrete uniform distribution, and if you draw its graph you will realize that if you put a mirror in the middle, the left-hand side is the mirror image of the right-hand side. So it is symmetric, and therefore the mean, or expected value, of this random variable has to be in the exact middle. Among the values 1, 2, 3, 4, 5, 6, the middle lies between 3 and 4, which is 3.5. Now I have told you the expected values of X1 and X2, where each experiment was done separately. But now suppose I am doing the two experiments simultaneously: with one hand I am tossing the coins, with the other I am throwing the die. Then, if you consider the two variables X1 and X2 together in the form of a random vector, we can say that the expected value of the random vector X is the vector (E of X1, E of X2). And now that we have the numerical values of E of X1 and E of X2, namely 1 and 3.5, we can insert them, and our final conclusion is that E of X, where X is the random vector, is the vector (1, 3.5).
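The die calculation and the assembly of the vector can be sketched the same way (again a minimal illustration with names of my own choosing, assuming a fair six-sided die):

```python
from fractions import Fraction

# X2 = number on the uppermost face of a fair die.
# Each face 1..6 has probability 1/6, so E(X2) = (1+2+...+6)/6.
e_x2 = sum(Fraction(face, 6) for face in range(1, 7))
print(e_x2)  # 7/2, i.e. 3.5

# E(X1) = 1 from the two-coin experiment worked out earlier.
e_x1 = Fraction(1)

# The expected value of the random vector X = (X1, X2) is simply
# the vector of the componentwise expectations.
e_x = (e_x1, e_x2)
print([float(component) for component in e_x])  # [1.0, 3.5]
```

Note that nothing about this construction requires X1 and X2 to be independent; the expectation of the vector is taken component by component regardless.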
This is how we proceed when we want to determine the expected value of a random vector.