So in this example we're going to find some minimal sufficient statistics. We begin by recalling how we find minimal sufficient statistics in general, using the ratio criterion that follows from Fisher's factorization theorem: take the ratio of the joint density evaluated at $\mathbf{x}$ to the joint density evaluated at $\mathbf{y}$; a statistic $T$ is minimal sufficient if this ratio is free of the parameter of interest if and only if $T(\mathbf{x}) = T(\mathbf{y})$. This gives a very simple way of finding minimal sufficient statistics.

Our first example is the normal distribution. A single observation has density $f(x) = (2\pi\sigma^2)^{-1/2} \exp\!\big(-(x-\mu)^2/(2\sigma^2)\big)$, so the ratio for single observations $x$ and $y$ is

$\dfrac{f(x)}{f(y)} = \dfrac{(2\pi\sigma^2)^{-1/2} \exp\!\big(-(x-\mu)^2/(2\sigma^2)\big)}{(2\pi\sigma^2)^{-1/2} \exp\!\big(-(y-\mu)^2/(2\sigma^2)\big)}.$

The factor $(2\pi\sigma^2)^{-1/2}$ appears both above and below the line, so we can cancel it; and the minus sign in the denominator's exponent becomes a plus when we bring it up, leaving

$\dfrac{f(x)}{f(y)} = \exp\!\left( \dfrac{(y-\mu)^2 - (x-\mu)^2}{2\sigma^2} \right).$
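As a quick numerical sanity check of that cancellation, here is a short Python sketch (the function names are mine, not from the lecture) comparing the direct ratio of two normal densities with the simplified exponential form:

```python
import math

def normal_pdf(v, mu, sigma2):
    """Density of N(mu, sigma2) at v: (2*pi*sigma2)^(-1/2) * exp(-(v-mu)^2 / (2*sigma2))."""
    return (2 * math.pi * sigma2) ** -0.5 * math.exp(-(v - mu) ** 2 / (2 * sigma2))

def simplified_ratio(x, y, mu, sigma2):
    """The ratio f(x)/f(y) after cancelling the common (2*pi*sigma2)^(-1/2) factor."""
    return math.exp(((y - mu) ** 2 - (x - mu) ** 2) / (2 * sigma2))

# The direct ratio f(x)/f(y) and the simplified form agree.
x, y, mu, sigma2 = 1.3, -0.7, 0.5, 2.0
direct = normal_pdf(x, mu, sigma2) / normal_pdf(y, mu, sigma2)
assert math.isclose(direct, simplified_ratio(x, y, mu, sigma2))
```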
Now that's a single observation, but we're looking for the case where we have more than one observation, so we think in terms of vectors $\mathbf{x} = (x_1, \dots, x_n)$ and $\mathbf{y} = (y_1, \dots, y_n)$ instead. The ratio becomes the product of what we just had:

$\dfrac{f(\mathbf{x})}{f(\mathbf{y})} = \prod_{i=1}^{n} \exp\!\left( \dfrac{(y_i-\mu)^2 - (x_i-\mu)^2}{2\sigma^2} \right),$

and if we bring the product inside the exponential it becomes a sum. Expanding the squared terms out gives

$\exp\!\left( \dfrac{\sum_{i=1}^{n} y_i^2 - 2\mu \sum_{i=1}^{n} y_i + n\mu^2 - \sum_{i=1}^{n} x_i^2 + 2\mu \sum_{i=1}^{n} x_i - n\mu^2}{2\sigma^2} \right).$

We tidy this up a bit: the $+n\mu^2$ and $-n\mu^2$ cancel each other out, so gathering terms we have

$\dfrac{f(\mathbf{x})}{f(\mathbf{y})} = \exp\!\left( \dfrac{\sum_{i=1}^{n} y_i^2 - \sum_{i=1}^{n} x_i^2 + 2\mu \left( \sum_{i=1}^{n} x_i - \sum_{i=1}^{n} y_i \right)}{2\sigma^2} \right).$

This is constant with respect to $\mu$ if and only if $\sum x_i = \sum y_i$, so $\sum_{i=1}^{n} x_i$ is a minimal sufficient statistic for $\mu$ (with $\sigma^2$ known). And it's constant with respect to both $\mu$ and $\sigma^2$ if and only if two things happen: $\sum x_i = \sum y_i$, so that the $\mu$ term disappears, and $\sum x_i^2 = \sum y_i^2$, so that the remaining term disappears too. So your minimal sufficient statistic for $\mu$ is $\sum x_i$, or any one-to-one function of it, for example the mean $\bar{x} = \frac{1}{n}\sum x_i$; and your minimal sufficient statistic for the pair $(\mu, \sigma^2)$ is $\left( \sum_{i=1}^{n} x_i, \sum_{i=1}^{n} x_i^2 \right)$. So that's how you calculate minimal sufficient statistics using the result
of Fisher's factorization theorem
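The conclusion can be illustrated numerically. The sketch below (my own construction, not part of the lecture) builds two samples with equal $\sum x_i$ and $\sum x_i^2$, whose likelihood ratio is therefore 1 for every $(\mu, \sigma^2)$, and a third sample sharing only the sum, whose ratio is free of $\mu$ but still moves with $\sigma^2$:

```python
import math

def log_likelihood(data, mu, sigma2):
    """Normal log-likelihood of an i.i.d. sample."""
    n = len(data)
    return (-n / 2 * math.log(2 * math.pi * sigma2)
            - sum((v - mu) ** 2 for v in data) / (2 * sigma2))

def log_ratio(xs, ys, mu, sigma2):
    """log of f(xs; mu, sigma2) / f(ys; mu, sigma2)."""
    return log_likelihood(xs, mu, sigma2) - log_likelihood(ys, mu, sigma2)

x = [1.0, 2.0, 3.0]   # sum = 6, sum of squares = 14
y = [3.0, 1.0, 2.0]   # same sum and same sum of squares
z = [0.0, 2.0, 4.0]   # same sum (6), different sum of squares (20)

# Equal (sum, sum of squares) => ratio is 1 for every (mu, sigma2).
for mu in (-1.0, 0.0, 2.5):
    for sigma2 in (0.5, 1.0, 4.0):
        assert math.isclose(log_ratio(x, y, mu, sigma2), 0.0, abs_tol=1e-12)

# Equal sums alone => ratio is free of mu, but it still depends on sigma2.
assert math.isclose(log_ratio(z, x, 0.0, 1.0), log_ratio(z, x, 5.0, 1.0))
assert not math.isclose(log_ratio(z, x, 0.0, 1.0), log_ratio(z, x, 0.0, 4.0))
```

The last two assertions are exactly the "if and only if" in the derivation: matching $\sum x_i$ kills the $\mu$ dependence, and only matching $\sum x_i^2$ as well kills the $\sigma^2$ dependence.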