In this video I will expand the previous video's principle to covariance matrices. A correlation matrix is a special case of a covariance matrix: one that has been scaled so that the variance of each variable is 1. So a correlation matrix is kind of like a standardized version of a covariance matrix. Some features of linear models are better understood in the covariance metric, so understanding the same set of tracing rules in covariance form is useful.

Let's take a look at the covariance between x1 and y. We calculate the covariance of x1 and y in exactly the same way as we calculated the correlation, except that we now work with unstandardized regression coefficients. Previously we were working with standardized regression coefficients; these are now unstandardized because we work on the raw metric instead of the correlation metric. The first path goes from x1 directly to y, picking up the variance of x1 and the coefficient beta1. Another way from x1 to y is to travel one covariance, the covariance of x1 and x2, and then the regression path from x2 to y, giving beta2 times that covariance. The third path goes from x1 to x3, one covariance, and then to y. We sum those three terms together, and that gives us the covariance between x1 and y. That's exactly the same math that we had in the correlation example, but instead of working with correlations, we work with covariances.

Things get more interesting when we look at the variance of y. The variance of y is given by the equation here. The idea is that we start from y, travel to each source of variance of y, and then come back. So we go from y to x1, take the variance of x1, and come back: that gives the variance of x1 times beta1 squared. In the correlation metric we just took beta1 times beta1, beta1 squared, because every variance in a correlation matrix is 1, so we could ignore it. Then we go from y through beta1 to x1, across the covariance of x1 and x2 to x2, and back to y through beta2; and because we can travel that route in both directions, the term enters twice.
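The covariance tracing rule described above can be checked numerically. This is a sketch with hypothetical coefficient values and a hypothetical predictor covariance matrix (none of these numbers come from the video); it sums the three paths from x1 to y and compares the result against the equivalent matrix expression.

```python
import numpy as np

# Hypothetical unstandardized coefficients beta1, beta2, beta3
# (illustrative values, not from the video).
b = np.array([0.5, 0.3, 0.2])

# Hypothetical covariance matrix of the predictors x1, x2, x3.
S_x = np.array([[2.0, 0.4, 0.3],
                [0.4, 1.5, 0.2],
                [0.3, 0.2, 1.0]])

# Tracing rule: cov(x1, y) = beta1*var(x1)
#                          + beta2*cov(x1, x2)
#                          + beta3*cov(x1, x3)
cov_x1_y = b[0] * S_x[0, 0] + b[1] * S_x[0, 1] + b[2] * S_x[0, 2]

# The same rule for all predictors at once: cov(x, y) = S_x @ b
print(cov_x1_y)        # 1.18 with these values
print((S_x @ b)[0])    # matches the hand-traced sum
```

The matrix form makes the point of the rule visible: the vector of covariances between the predictors and y is just the predictor covariance matrix times the coefficient vector.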
This is a useful rule, because it allows us to see that the variance of y is a sum of all these different sources of variation: variation due to x1, variation due to x2, the covariance between x1 and x2, and so on, plus variation due to the error term. So the variance of y is the sum of all the variances and covariances of the explanatory variables, weighted by the regression coefficients, plus the variance of u, the error term that is uncorrelated with all the explanatory variables. This covariance form of the model-implied covariance matrix rule is useful when you start working with more complicated models, such as confirmatory factor analysis models.
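The variance decomposition can be sketched the same way. Again the coefficients, predictor covariances, and error variance below are hypothetical illustration values, not taken from the video; the code adds up each squared path, each doubled covariance route, and the error variance, then checks the total against the matrix form.

```python
import numpy as np

# Hypothetical values (illustration only): coefficients, predictor
# covariance matrix, and error variance var(u).
b = np.array([0.5, 0.3, 0.2])
S_x = np.array([[2.0, 0.4, 0.3],
                [0.4, 1.5, 0.2],
                [0.3, 0.2, 1.0]])
var_u = 0.8

# Variance rule, term by term:
#   squared paths through each predictor's variance,
#   plus each covariance route counted twice (both directions),
#   plus the error variance.
var_y = (b[0]**2 * S_x[0, 0]
         + b[1]**2 * S_x[1, 1]
         + b[2]**2 * S_x[2, 2]
         + 2 * b[0] * b[1] * S_x[0, 1]
         + 2 * b[0] * b[2] * S_x[0, 2]
         + 2 * b[1] * b[2] * S_x[1, 2]
         + var_u)

# Matrix form of the same decomposition: var(y) = b' S_x b + var(u)
print(var_y)                   # 1.679 with these values
print(b @ S_x @ b + var_u)     # same value
```

Writing it as b' S_x b + var(u) shows why the rule generalizes: for any number of predictors, the quadratic form collects every squared path and every doubled covariance route automatically.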