Hi, I'm Zor. Welcome to Unizor education. Continuing the topics related to random variables, today's subject is the variance of the sum of two random variables. I do recommend you watch this lecture on the Unizor.com website, because it contains notes. Notes are very important: they are basically a short description of the lecture. The website also contains exercises, exams for registered students, and so on, so it is very beneficial to work with the website itself, not with some other reference website or YouTube.

Alright, so we are talking about the variance of the sum of two random variables. We have already discussed that the expectation (the expected value, or mean) of the sum of two random variables equals the sum of their expectations: if you have two random variables, then the expectation of their sum is the sum of their expectations. Now, obviously, it would be very nice to see something like this for variances or standard deviations. Well, it's not exactly like that. So let me first state the goal of this lecture: I would like to prove that the variance of the sum of two random variables equals the sum of their variances, but only if they are independent,

Var(ξ + η) = Var(ξ) + Var(η), provided ξ and η are independent.

So independence is a very important property. We discussed the independence of two random variables in the previous lecture, and now I'm going to use it. One of the very important properties of independent random variables is this: the probability that one of them takes the value x while the other simultaneously takes the value y equals the product of the individual probabilities,

P(ξ = x and η = y) = P(ξ = x)·P(η = y).

This was discussed in the previous lecture, and it is exactly the property I'm going to use here to prove this equality for independent random variables ξ and η.

Alright, I have stated the goal of this lecture, so let's pursue it in the usual fashion. First of all, let's introduce our random variables. Say we have a sample space containing certain elementary events, each with its probability, and a random variable ξ defined on this sample space which takes the corresponding values x1, x2, ..., xm. From this point we can really forget about the sample space and just concentrate on the random variable ξ, which takes the values x1, ..., xm with probabilities p1, ..., pm. That's sufficient for our discussion. Now let's introduce another random variable, η, which takes the values y1, ..., yn with probabilities q1, ..., qn. So we have two random variables, and again we assume they are independent, which means the probability of ξ taking one of its values while η simultaneously takes one of its values equals the product of the corresponding individual probabilities.

All right. Now, the variance is basically a weighted average of the squares of deviations from the mean. If a is the expected value of ξ, then (ξ − a)² is a new random variable, the square of the deviation of ξ from its mean value, and applying the expectation to it means averaging with weights equal to the probabilities. In other words: if a = E(ξ) = x1·p1 + ... + xm·pm, then

Var(ξ) = (x1 − a)²·p1 + ... + (xm − a)²·pm.

So we are weighting the squares of the deviations of the values of our random variable from its expectation, with weights p1, ..., pm.
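To make this definition concrete, here is a minimal sketch in Python; the values and probabilities below are made up purely for illustration:

```python
# Minimal sketch: expectation and variance of a discrete random variable,
# given its values x1..xm and probabilities p1..pm, exactly as defined above.

def expectation(values, probs):
    # E(xi) = x1*p1 + ... + xm*pm
    return sum(x * p for x, p in zip(values, probs))

def variance(values, probs):
    # Var(xi) = (x1 - a)^2 * p1 + ... + (xm - a)^2 * pm, where a = E(xi)
    a = expectation(values, probs)
    return sum((x - a) ** 2 * p for x, p in zip(values, probs))

xs = [1, 2, 3]          # hypothetical values of xi
ps = [0.2, 0.5, 0.3]    # their probabilities (must sum to 1)
print(expectation(xs, ps), variance(xs, ps))
# 2.1 and 0.49 (up to floating-point rounding)
```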
Similarly, the variance of η would be (y1 − b)²·q1 + ... + (yn − b)²·qn, where b is the expectation of η, q1 is the probability of η taking y1, and so on.

Now, these are very long expressions, and I usually use the symbol Σ, the Greek letter sigma, for sums. So I can rewrite the first one as Var(ξ) = Σ_i (x_i − a)²·p_i, where i changes from 1 to m: first we substitute 1, then 2, then 3, and so on, and sum the terms together. I will use this notation as the simpler one. Similarly, using another index j, Var(η) = Σ_j (y_j − b)²·q_j, where j changes from 1 to n.

Now let's talk about the sum of these two random variables, ξ + η. What kind of values does this sum take? Well, for instance, ξ can be x1 and η can be y1; or we can get x1 + y2, or x1 + y3, and so on; then x2 + y1, x2 + y2, etc., up to xm + y1, xm + y2, and so forth. So all the different pairs of x's and y's are actually possible: ξ + η takes the values x_i + y_j, where i goes from 1 to m and j goes from 1 to n. The indices i and j run through all the different pairs, so we have m·n different values.

All right. First of all, what's the expectation? Well, that's easy. We know that the expectation of the sum of two random variables equals the sum of their expectations, so it's a + b in our notation. If you remember, we designated a as the expectation of ξ and b as the expectation of η.

Now, the variance is a little more difficult. So let me calculate the variance of ξ + η (we don't really need to write out the bounds m and n every time; we understand them). What is it? It's the weighted average of the squares of the deviations of this sum from its average. The sum takes all the pair values, so I will do two summations, i from 1 to m and j from 1 to n. For each pair of indices i and j, x_i + y_j is the value of my variable and a + b is the expectation, so I square the difference and multiply by the probability of this particular value:

Var(ξ + η) = Σ_i Σ_j (x_i + y_j − (a + b))² · P(ξ + η = x_i + y_j).

So what is that probability? The probability of ξ + η taking the value x_i + y_j is the probability of ξ taking the value x_i while, simultaneously, η takes the value y_j. Now remember, and this is extremely important: ξ and η are independent variables, and therefore the probability of them simultaneously taking these two values equals the product of the individual probabilities,

P(ξ = x_i and η = y_j) = P(ξ = x_i)·P(η = y_j) = p_i·q_j,

because the probability of ξ taking x_i is p_i and the probability of η taking y_j is q_j. So in place of the probability I can put p_i·q_j:

Var(ξ + η) = Σ_i Σ_j (x_i + y_j − a − b)²·p_i·q_j.
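As a sanity check of this double-sum formula, here is a small sketch, again with made-up distributions: it enumerates all m·n pairs, weights each squared deviation by p_i·q_j, and compares the result with Var(ξ) + Var(η), which we are about to prove equal:

```python
# Sketch: variance of xi + eta via the double sum over all pairs,
# assuming independence, so P(xi = x_i, eta = y_j) = p_i * q_j.

xs, ps = [1, 2, 3], [0.2, 0.5, 0.3]      # hypothetical distribution of xi
ys, qs = [0, 10], [0.6, 0.4]             # hypothetical distribution of eta

a = sum(x * p for x, p in zip(xs, ps))   # a = E(xi)
b = sum(y * q for y, q in zip(ys, qs))   # b = E(eta)

# Var(xi + eta) = sum_i sum_j (x_i + y_j - (a + b))^2 * p_i * q_j
var_sum = sum((x + y - (a + b)) ** 2 * p * q
              for x, p in zip(xs, ps)
              for y, q in zip(ys, qs))

var_xi  = sum((x - a) ** 2 * p for x, p in zip(xs, ps))
var_eta = sum((y - b) ** 2 * q for y, q in zip(ys, qs))

print(var_sum, var_xi + var_eta)
# both print 24.49 (up to floating-point rounding)
```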
This is basically the end of the creative part of this lecture. Everything that follows is just a technical algebraic transformation of this particular expression into something that resembles Var(ξ) + Var(η). All I have to do is a few smart transformations of it. Well, let's try; it's not such a big deal. Alright, so first let me open up this square.

If you don't mind, I'll write just Σ_i Σ_j, with the understanding that i goes from 1 to m and j goes from 1 to n; it's simply shorter. First I will regroup slightly:

Var(ξ + η) = Σ_i Σ_j ((x_i − a) + (y_j − b))²·p_i·q_j.

This is the same expression; I just regrouped the terms. Now this is the square of a sum of two numbers, so I open it up: the square of the first, plus twice the product of the first and the second, plus the square of the second:

Var(ξ + η) = Σ_i Σ_j [(x_i − a)² + 2(x_i − a)(y_j − b) + (y_j − b)²]·p_i·q_j.

Every member of this sum is multiplied by p_i·q_j, so I can represent the whole thing as three different sigmas:

Σ_i Σ_j (x_i − a)²·p_i·q_j + Σ_i Σ_j 2(x_i − a)(y_j − b)·p_i·q_j + Σ_i Σ_j (y_j − b)²·p_i·q_j.

Okay, great. Now let's examine the first term. You understand that when we sum over i and j, the order of the sigmas doesn't matter. You can consider it as a matrix: i is the index of the row, j is the index of the column, and in each cell at the intersection of a row and a column we have one member of the sum. We are summing all the elements of this table, and it doesn't matter whether we sum by row or by column. Now, (x_i − a)²·p_i does not depend on j, which means that when we sum over j it is a constant, so I can move it outside of the inner sigma:

Σ_i (x_i − a)²·p_i · Σ_j q_j.

And what is Σ_j q_j? It is the sum of all the probabilities of the values η can take, so it is obviously equal to 1. Which means my first term equals Σ_i (x_i − a)²·p_i, and that is Var(ξ) by definition.

Okay, next member: the cross term. Let's consider what it actually is. The 2 can be moved outside of all the sigmas, because every member is multiplied by it. Then I have the sum of the products of all the different values (x_i − a)·p_i and (y_j − b)·q_j, and my statement is that this equals the product of two separate sums:

Σ_i Σ_j (x_i − a)(y_j − b)·p_i·q_j = [Σ_i (x_i − a)·p_i]·[Σ_j (y_j − b)·q_j].

Why? Imagine the first bracket as a sum of m different members and the second as a sum of n different members. The double sum is the sum of their products, where the indices i and j go through all the possible pairs, and that is exactly what happens when you multiply one sum by another: first times first, first times second, and so on through all the different pairs. So these two expressions are exactly the same thing.

And what is Σ_i (x_i − a)·p_i? Well, this is easy, because you can rewrite it as

Σ_i (x_i − a)·p_i = Σ_i x_i·p_i − a·Σ_i p_i.

The sigma of a difference is the difference of sigmas, and in the second sigma I factored out a. Now, Σ_i p_i equals 1, because it's the sum of all the probabilities of the values ξ can take. And Σ_i x_i·p_i is a: it is the expectation of ξ by definition. So the bracket equals a − a, which is zero.
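This centering fact, that the weighted deviations from the mean always sum to zero, is easy to verify numerically; a tiny sketch with the same made-up distribution as before:

```python
# Sketch: sum_i (x_i - a) * p_i is always 0, since a is E(xi);
# this is exactly what kills the cross term.

xs, ps = [1, 2, 3], [0.2, 0.5, 0.3]   # hypothetical distribution of xi
a = sum(x * p for x, p in zip(xs, ps))

print(sum((x - a) * p for x, p in zip(xs, ps)))
# 0.0 (up to floating-point rounding)
```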
And absolutely similarly, the other bracket, Σ_j (y_j − b)·q_j, is also equal to zero, although it doesn't really matter: if one multiplier equals zero, the whole product equals zero. So the whole cross term is zero, and I wipe it out.

Finally, I have my last member, and as you understand, it's very similar to the first one, with a similar result. Let me just show it. I have Σ_j Σ_i (y_j − b)²·p_i·q_j. Now, (y_j − b)² and q_j are constant while I'm summing over i, so this equals Σ_j (y_j − b)²·q_j · Σ_i p_i. And Σ_i p_i equals 1, because it's the sum of all the probabilities of the values the random variable ξ can take. What's left is Σ_j (y_j − b)²·q_j, which is the variance of η. So instead of this term I can put Var(η).

Well, that's it. As I said, it's just a couple of relatively easy algebraic transformations of the expression for the variance, and the result is

Var(ξ + η) = Var(ξ) + Var(η) for independent ξ and η.

Now, I was using the sigma notation because it's really easier. Instead of Σ I could write out the full sums from 1 to m or from 1 to n, but that would be much bulkier on the board; sigma is an easy thing to operate with.

So that's it; that's the end of this particular lecture. And again, let's remember this very, very important property of variances: the variance of the sum of two random variables equals the sum of their variances, but only under the condition (and we proved it only under this condition) that the variables are independent. That's why we could replace the probability of one variable taking some value while, at the same time, the other takes some other value with the product of the corresponding individual probabilities; the proof is based on exactly that.

All right, let me suggest once again that you go to Unizor.com and read the notes for this lecture. It will be easier to understand, and the material will settle better in your mind. The proof is written out there, so I do suggest you read it; it's always beneficial for you. That's it. Thanks very much, and good luck.
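As a short appendix to these notes (not part of the lecture itself), here is a simulation sketch, with made-up distributions, of why independence matters: the identity holds empirically for independent variables and generally fails for dependent ones, for example when η = 2ξ.

```python
# Sketch: empirical check that Var(xi + eta) = Var(xi) + Var(eta)
# holds for independent variables but generally fails for dependent ones.
import random

def var(samples):
    m = sum(samples) / len(samples)
    return sum((s - m) ** 2 for s in samples) / len(samples)

random.seed(0)
xi  = [random.choice([1, 2, 3]) for _ in range(100_000)]
eta = [random.choice([0, 10])   for _ in range(100_000)]  # independent of xi

print(var([x + y for x, y in zip(xi, eta)]), var(xi) + var(eta))
# approximately equal, as the theorem predicts

dep = [2 * x for x in xi]   # a dependent variable: eta' = 2*xi
print(var([x + y for x, y in zip(xi, dep)]), var(xi) + var(dep))
# visibly different: Var(3*xi) = 9*Var(xi), while the sum gives 5*Var(xi)
```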