Dear student, today we are going to learn the distribution of the sample mean and the sample variance — that is, the distribution of the sample mean vector \bar{x} and the sample covariance matrix S. Let x_1, x_2, \ldots, x_n be a random sample from a multivariate normal distribution with mean vector \mu and variance-covariance matrix \Sigma; that is, each x_i \sim N_p(\mu, \Sigma). By definition, the sample mean vector is

\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i.

Now take the expectation on both sides; the expectation applies to the random vectors. Just as in the univariate case, E[x_i] = \mu for every i, so

E[\bar{x}] = \frac{1}{n} \sum_{i=1}^{n} E[x_i] = \frac{1}{n} \cdot n\mu = \mu,

because \mu is a constant and the n in the numerator cancels the n in the denominator. Hence E[\bar{x}] = \mu. Similarly, consider the matrix of squares and cross-products. Opening up \bar{x}, we have

(\bar{x} - \mu)(\bar{x} - \mu)' = \left( \frac{1}{n}\sum_{i=1}^{n} x_i - \mu \right)\left( \frac{1}{n}\sum_{i=1}^{n} x_i - \mu \right)'
= \left( \frac{\sum_{i=1}^{n} x_i - n\mu}{n} \right)\left( \frac{\sum_{i=1}^{n} x_i - n\mu}{n} \right)'
= \left[ \frac{1}{n} \sum_{i=1}^{n} (x_i - \mu) \right] \left[ \frac{1}{n} \sum_{i=1}^{n} (x_i - \mu) \right]'.
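As a quick numerical check of E[\bar{x}] = \mu (not part of the derivation itself), here is a minimal Python sketch. To keep it self-contained it uses a diagonal-covariance special case of the multivariate normal, so the two components can be drawn independently; the values of mu, sigma, and n are illustrative assumptions.

```python
import random

random.seed(0)

# Illustrative setup: p = 2 dimensions, diagonal covariance, so each
# component of x_i can be drawn independently with random.gauss.
mu = [3.0, -1.0]       # true mean vector
sigma = [2.0, 0.5]     # component standard deviations
n = 100_000            # sample size

# Draw the random sample x_1, ..., x_n and form x̄ = (1/n) Σ x_i.
sample = [[random.gauss(mu[k], sigma[k]) for k in range(2)] for _ in range(n)]
xbar = [sum(x[k] for x in sample) / n for k in range(2)]

print(xbar)  # should be close to mu = [3.0, -1.0]
```

With n this large, each component of \bar{x} lands within a few thousandths of the corresponding component of \mu, illustrating the unbiasedness we just proved.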
Further, expanding the product of the two sums gives a double sum with a factor 1/n^2; the two sums run over separate dummy indices, so write one as i and the other as j (the subscripts are only dummies, so this change has no effect):

(\bar{x} - \mu)(\bar{x} - \mu)' = \frac{1}{n^2} \sum_{i=1}^{n} \sum_{j=1}^{n} (x_i - \mu)(x_j - \mu)'.

Now, by definition, the covariance matrix of a random vector x is Cov(x) = E[(x - E[x])(x - E[x])']. Applying this definition to \bar{x} and putting in the value we just obtained,

Cov(\bar{x}) = E[(\bar{x} - \mu)(\bar{x} - \mu)'] = \frac{1}{n^2} \sum_{i=1}^{n} \sum_{j=1}^{n} E[(x_i - \mu)(x_j - \mu)'].

Since each x_i is distributed independently of each x_j for i \ne j, and independence implies zero covariance, the cross terms vanish:

E[(x_i - \mu)(x_j - \mu)'] = 0 for i \ne j.

What is the role of the condition i \ne j? If i = j we get a variance term, and if i \ne j we get a covariance term. So only the n terms with i = j survive, and each of them is the covariance matrix of a single observation:

E[(x_i - \mu)(x_i - \mu)'] = \Sigma.
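The key step above — independence kills the cross terms, while the i = j terms give the variance — can be checked by simulation. Here is a hedged one-dimensional sketch (mu and the variance are illustrative values, and the scalar case stands in for the matrix case):

```python
import random

random.seed(1)

# Monte Carlo check: for independent draws x_i, x_j (i ≠ j),
# E[(x_i - mu)(x_j - mu)] = 0, while the i = j term gives the variance.
mu, var = 0.0, 4.0
reps = 200_000

cross = 0.0   # accumulates (x_i - mu)(x_j - mu) for independent x_i, x_j
same = 0.0    # accumulates (x_i - mu)^2 for the i = j term
for _ in range(reps):
    xi = random.gauss(mu, var ** 0.5)
    xj = random.gauss(mu, var ** 0.5)
    cross += (xi - mu) * (xj - mu)
    same += (xi - mu) ** 2

print(cross / reps)  # near 0: independence makes the cross term vanish
print(same / reps)   # near 4.0: the i = j term recovers the variance
```

The empirical cross-moment hovers near zero while the matched-index moment recovers the variance, mirroring the two cases i \ne j and i = j in the double sum.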
Now, you know that \Sigma is a constant matrix, so the sum contributes n copies of it, and the n cancels against one factor of n in the denominator:

Cov(\bar{x}) = \frac{1}{n^2} \cdot n\Sigma = \frac{\Sigma}{n}.

We have now shown both pieces: the expectation E[\bar{x}] = \mu and the variance-covariance matrix Cov(\bar{x}) = \Sigma / n. Since \bar{x} is a linear combination of independent multivariate normal vectors, it is itself multivariate normal. That is, the sample mean is distributed as

\bar{x} \sim N_p\!\left( \mu, \frac{\Sigma}{n} \right),

a multivariate normal with mean vector \mu and variance-covariance matrix \Sigma / n.
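The final result Cov(\bar{x}) = \Sigma / n can also be seen empirically: simulate many samples, compute each sample mean, and look at the spread of those means. A minimal univariate sketch (a special case of the matrix result; mu, sigma, and n are illustrative):

```python
import random

random.seed(2)

# Empirical check that Var(x̄) = sigma^2 / n, the univariate special
# case of Cov(x̄) = Sigma / n.
mu, sigma, n = 5.0, 3.0, 25
reps = 50_000

# Repeatedly draw a sample of size n and record its mean.
means = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(sum(sample) / n)

m = sum(means) / reps
v = sum((x - m) ** 2 for x in means) / reps
print(m)  # near mu = 5.0
print(v)  # near sigma^2 / n = 9 / 25 = 0.36
```

The variance of the simulated sample means comes out near \sigma^2 / n = 0.36, not \sigma^2 = 9, which is exactly the 1/n shrinkage the derivation predicts.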